Authors: Xiangrong Zhang, Yudi He, Licheng Jiao, Ruochen Liu, Jie Feng, Sisi Zhou
Publish Date: 2014/04/06
Volume: 43, Issue: 3, Pages: 633-655
Abstract
Dimension reduction has always been a major problem in many applications of machine learning and pattern recognition. In this paper, scaling cut criterion-based supervised dimension reduction methods for data analysis are proposed. The scaling cut criterion removes the restrictive assumption that the data distribution of each class is homoscedastic Gaussian. To obtain a more reasonable mapping matrix and to reduce the computational complexity, local scaling cut criterion-based dimension reduction is proposed, which exploits a localization strategy on the input data. A localized k-nearest neighbor graph is introduced, which relaxes the within-class variance and enlarges the between-class margin. Moreover, by kernelizing the scaling cut criterion and the local scaling cut criterion, both methods are extended to efficiently model the nonlinear variability of the data. Furthermore, the optimal dimension scaling cut criterion is proposed, which automatically selects the optimal dimension for the dimension reduction methods. The approaches have been tested on several datasets, and the results show better and more efficient performance than other linear and nonlinear dimension reduction techniques.

This work was supported by the National Basic Research Program of China (973 Program, Grant 2013CB329402), the National Natural Science Foundation of China (Nos. 61272282, 61203303, and 61272279), the Program for New Century Excellent Talents in University (NCET-13-0948), and the Fundamental Research Funds for the Central Universities (Grant K50511020011).
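The abstract does not give the criterion's exact formulation, so the following is only a minimal sketch assuming the common pairwise-scatter form of a scaling-cut-style criterion: between-class and within-class scatter matrices are built from pairwise sample differences rather than class means (which is what avoids the homoscedastic Gaussian assumption), and the projection is obtained from a generalized eigenproblem. The function name `scaling_cut` and the regularization term are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def scaling_cut(X, y, d):
    """Sketch of a scaling-cut-style linear dimension reduction.

    X: (n_samples, n_features) data matrix.
    y: (n_samples,) class labels.
    d: target dimension.
    Returns W, an (n_features, d) projection matrix; project with X @ W.
    """
    n, m = X.shape
    Sb = np.zeros((m, m))  # between-class pairwise scatter
    Sw = np.zeros((m, m))  # within-class pairwise scatter
    for i in range(n):
        diff = X - X[i]            # differences from sample i to all samples
        same = (y == y[i])
        Sw += diff[same].T @ diff[same]    # accumulate same-class pairs
        Sb += diff[~same].T @ diff[~same]  # accumulate different-class pairs
    # Leading generalized eigenvectors of (Sb, Sw) maximize the
    # between-class to within-class scatter ratio; a small ridge keeps
    # Sw positive definite (an assumption for numerical stability).
    evals, evecs = eigh(Sb, Sw + 1e-6 * np.eye(m))
    W = evecs[:, np.argsort(evals)[::-1][:d]]
    return W
```

Under the same assumptions, the local variant described in the abstract would restrict the pairwise sums to each sample's k nearest neighbors, and the kernelized variants would carry out the equivalent computation on a kernel matrix instead of X.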
Keywords: