Improving the accuracy of hyperspectral image classification is a crucial and complex issue. Hyperspectral images capture the spectral variation of the land surface in near-continuous spectral detail. On the one hand, this characteristic is widely exploited to analyze and interpret different land-cover classes. On the other hand, the high dimensionality of the spectral space introduces challenging methodological issues, such as the curse of dimensionality. Subspace ensemble systems, such as the random subspace method (RSM), significantly outperform single classifiers in hyperspectral image classification. However, two issues must be addressed to improve the robustness and overall accuracy of such systems: the first is the diversity within the subspace ensemble, and the second is the classification accuracy of the individual subspaces. In this paper, we adopt the Support Vector Machine (SVM) as the base classifier and propose a novel subspace ensemble method, namely the Optimal Subspace SVM Ensemble (OSSE), for hyperspectral image classification to improve the performance of RSM. Starting from random subspace selection as the initial step, a two-step procedure is designed to avoid similarity within the ensemble while optimizing the accuracy of the individual subspaces. Instead of maximizing ensemble diversity with a specific diversity measure, the first step groups the random base classifiers by k-means clustering according to the similarity of their SVM patterns. In the second step, an optimization process selects the optimal subspace from each group formed in the first step, using the Jeffries-Matusita (J-M) distance as the criterion. The final label is decided by majority voting over the optimal subspaces. Experiments on two hyperspectral data sets reveal that the proposed OSSE achieves robust and higher overall accuracy compared with RSM and the random forest method. On the first hyperspectral image, the Pavia University data set, the maximum increases in Kappa coefficient and overall accuracy are about 0.04 and 2.64%, respectively, compared with RSM, and about 0.15 and 12.75%, respectively, compared with the random forest method. On the second hyperspectral image, the Indian Pines data set, the maximum increases in Kappa coefficient and overall accuracy are about 0.02 and 1.00%, respectively, compared with RSM, and about 0.13 and 11.12%, respectively, compared with the random forest method. The combination of optimal subspaces improves both the diversity of the subspace system and the accuracy of the individual classifiers, and thus performs better, particularly with limited training samples, a common situation in hyperspectral image classification. Based on the results of different parameter settings in OSSE, we found two interesting issues related to the number of clusters and the initial number of random subspaces. First, the optimal number of clusters in OSSE is stable for a given hyperspectral remote sensing data set; hence, it could be assessed from the characteristics of the remote sensing image. Second, as in RSM, increasing the number of random subspaces contributes little to the improvement of classification accuracy in OSSE. Consequently, to reduce computation time, selecting an excessive number of random subspaces should be avoided.
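To make the two-step procedure concrete, the sketch below outlines the OSSE pipeline with scikit-learn. It is a minimal illustration under stated assumptions, not the authors' implementation: the per-sample correctness vector on the training set stands in for the "SVM pattern" when clustering the base classifiers, and training accuracy stands in for the J-M criterion (commonly defined as J = 2(1 - exp(-B)), with B the Bhattacharyya distance between a class pair) used in the paper to pick the optimal subspace from each group. The function and parameter names (osse_fit_predict, n_subspaces, subspace_dim, n_clusters) are hypothetical.

```python
# Minimal sketch of the OSSE pipeline, assuming integer class labels and
# scikit-learn. The clustering proxy and the per-group selection score are
# simplifications of the paper's SVM-pattern similarity and J-M criterion.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC


def osse_fit_predict(X_train, y_train, X_test,
                     n_subspaces=30, subspace_dim=20,
                     n_clusters=5, random_state=0):
    rng = np.random.default_rng(random_state)
    n_bands = X_train.shape[1]

    # Initial step: random subspace (band subset) selection, as in RSM.
    subspaces = [rng.choice(n_bands, size=subspace_dim, replace=False)
                 for _ in range(n_subspaces)]

    # Train one SVM per random subspace and record whether each training
    # sample is classified correctly (a simple proxy for the "SVM pattern").
    models, patterns = [], []
    for bands in subspaces:
        clf = SVC(kernel="rbf", gamma="scale").fit(X_train[:, bands], y_train)
        models.append(clf)
        patterns.append(clf.predict(X_train[:, bands]) == y_train)
    patterns = np.asarray(patterns, dtype=float)  # (n_subspaces, n_train)

    # Step 1: group the base classifiers by the similarity of their patterns
    # with k-means, instead of maximizing a diversity measure directly.
    groups = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(patterns)

    # Step 2: keep one "optimal" subspace per group. The paper uses the J-M
    # distance as the criterion; training accuracy is only a stand-in here.
    selected = []
    for g in range(n_clusters):
        members = np.where(groups == g)[0]
        if members.size:
            scores = patterns[members].mean(axis=1)
            selected.append(members[int(np.argmax(scores))])

    # Final label: majority vote over the selected optimal subspaces.
    votes = np.array([models[i].predict(X_test[:, subspaces[i]])
                      for i in selected])
    final = []
    for column in votes.T:
        labels, counts = np.unique(column, return_counts=True)
        final.append(labels[np.argmax(counts)])
    return np.array(final)
```

The cluster-then-select design keeps only one member per group of similar classifiers, so the final vote combines fewer but less redundant and individually stronger subspaces than plain RSM, which is where the abstract attributes the gains in Kappa and overall accuracy.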
CITATION STYLE
Yang, K., Feng, X., Xiao, P., & Zhu, L. (2016). Optimal subspace ensemble with SVM for hyperspectral image Classification. Yaogan Xuebao/Journal of Remote Sensing, 20(3), 409–419. https://doi.org/10.11834/jrs.20165200