

MLSPTSVM

A Matlab code for the multi-class least squares recursive projection twin support vector machine (MLSPTSVM). [Code]


References

Yuan-Hai Shao, Nai-Yang Deng, Zhi-Min Yang. Least squares recursive projection twin support vector machine for classification. Pattern Recognition, 2012, 45(6): 2299-2307.

Chun-Na Li, Yun-Feng Huang, He-Ji Wu, Yuan-Hai Shao, Zhi-Min Yang. Multiple recursive projection twin support vector machine for multi-class classification. International Journal of Machine Learning and Cybernetics, 2014, DOI: 10.1007/s13042-014-0289-2.


Main Function
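
As a reading guide for the listing below, here is a compact summary reconstructed from the solver steps in the code; the symbols A_i, B_i, \bar{a}_i, e_1, e_2, W_i are introduced only for exposition and do not appear in the code. For each class i, let the rows of A_i be the class-i training patterns, the rows of B_i the remaining patterns, and \bar{a}_i the mean row of A_i. With the centered matrices H = A_i - e_1 \bar{a}_i and G = B_i - e_2 \bar{a}_i, each projection axis is the (subsequently normalized) solution of the regularized least squares system

\[ w = \Big( \tfrac{1}{c} H^{\top} H + \tfrac{v}{c} I + G^{\top} G \Big)^{-1} G^{\top} e_2 . \]

After each axis is found, the samples are deflated, A_i \leftarrow A_i - A_i w w^{\top} (and likewise B_i), and the next axis is extracted; FunPara.loop controls how many axes are kept per class. Stacking the axes of class i as the columns of W_i, a test pattern x (a row vector) is assigned to the class whose projected center lies nearest:

\[ \mathrm{label}(x) = \arg\min_i \, \lVert x W_i - \bar{a}_i W_i \rVert . \]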

function [Predict_Y] = K_CLASSLSPTSVM(TestX,DataTrain,FunPara)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% MLSPTSVM: Multi-class least squares recursive projection twin support
% vector machine.
%
% Inputs:
%  TestX:     Input features of the testing patterns.
%  DataTrain: Input features (DataTrain.X) and corresponding class labels
%             (DataTrain.Y) of the training patterns, and the number of
%             classes (DataTrain.Type).
%  FunPara:   All parameters used: penalty parameters c and v (FunPara.c
%             and FunPara.v), kernel type ('lin' or 'rbf'), kernel width
%             parameters (only for the rbf kernel), and the desired number
%             of projection axes (FunPara.loop).
%
% Outputs:
%  Predict_Y: Predicted labels of TestX.
%
% Example:
%  DataTrain.X = rand(100,10);
%  DataTrain.Y = [ones(20,1);2*ones(20,1);3*ones(20,1);4*ones(20,1);5*ones(20,1)];
%  DataTrain.Type = 5;
%  FunPara.c = 10; FunPara.v = 9;
%  FunPara.kerfPara.type = 'rbf'; FunPara.kerfPara.pars = 10;
%  TestX = rand(60,10);
%  FunPara.loop = 2;
%  [Predict_Y] = K_CLASSLSPTSVM(TestX,DataTrain,FunPara)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% Training %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
kerfPara = FunPara.kerfPara;
m = size(DataTrain.X,1);
n = size(TestX,1);
TestBack = TestX;
C = DataTrain.X;
Dis = zeros(n,DataTrain.Type); % Distances between projected test patterns and projected class centers.
Predict_Y = zeros(n,1);
for i = 1:DataTrain.Type % Determine the projection axes for each class.
    W1 = [];
    loop = FunPara.loop;
    SubclassIndex = find(DataTrain.Y==i);
    trainXA = DataTrain.X(SubclassIndex,:);              % Training patterns of the i-th class.
    trainXB = DataTrain.X(setdiff(1:m,SubclassIndex),:); % Training patterns of all other classes.
    if ~strcmp(kerfPara.type,'lin') % Nonlinear kernel: Gaussian, polynomial, and so on.
        if m >= 1000 % Use the rectangular kernel technique for large problems.
            ReduceIndex = randperm(m,int16(0.05*m)); % Select 5% of the training patterns.
            C = DataTrain.X(ReduceIndex,:);
        end
        trainXA = kernelfun(trainXA,kerfPara,C); % i-th class training patterns in the kernel space.
        trainXB = kernelfun(trainXB,kerfPara,C);
        TestX = kernelfun(TestBack,kerfPara,C);
    end
    w1 = zeros(size(trainXA,2),1); % Initialize the projection axis of the i-th class.
    centerA = mean(trainXA);
    while loop > 0 % Seek multiple projection axes for each class.
        trainXA = trainXA - trainXA*w1*w1'; % Recursively deflate the samples.
        trainXB = trainXB - trainXB*w1*w1';
        m1 = size(trainXA,1);
        m2 = size(trainXB,1);
        e1 = ones(m1,1);
        e2 = ones(m2,1);
        I1 = eye(size(trainXB,2));
        meanA = 1/m1*e1'*trainXA;
        H = trainXA - e1*meanA;
        G = trainXB - e2*meanA;
        Y = H'*H/FunPara.c + FunPara.v/FunPara.c*I1;
        %%%%%% Determine the projection axis for the i-th class %%%%%%
        if ~strcmp(kerfPara.type,'lin') && m < 1000 % Employ the SMW technique in the nonlinear case.
            I2 = eye(size(trainXA,1));
            I3 = eye(size(trainXB,1));
            HH = I2 + H*H'/FunPara.v;
            YY = FunPara.c/FunPara.v*(I1 - H'*(HH\H)/FunPara.v); % Requires one matrix inverse of order m1*m1.
            GG = I3 + G*YY*G';
            w1 = (YY - YY*G'*(GG\G)*YY)*G'*e2; % Requires two matrix inverses of orders m1*m1 and m2*m2 (m = m1+m2).
        else
            w1 = (Y+G'*G)\(G'*e2); % Requires one matrix inverse of order m*m.
        end
        w1 = w1/norm(w1);
        W1 = [W1 w1]; % Collect the desired projection axes of the i-th class.
        loop = loop - 1;
    end
    clear trainXA trainXB H G Y HH YY GG;
    for t = 1:n
        Dis(t,i) = norm(TestX(t,:)*W1 - centerA*W1);
    end
end
%%%%%%%%%%%%%%%%%%%%%%%% Output and prediction %%%%%%%%%%%%%%%%%%%%%%%%%%
for s = 1:n
    Predict_Y(s,1) = find(Dis(s,:)==min(Dis(s,:)),1); % Take the first minimizer in case of ties.
end
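
For a quick sanity check, a minimal driver script might look as follows. This is only a sketch: the three Gaussian blobs, the random seed, and the parameter values are invented for illustration, and it assumes K_CLASSLSPTSVM (and, for nonlinear kernels, the package's kernelfun helper) is on the Matlab path.

% Minimal usage sketch with synthetic data (hypothetical values throughout).
rng(1);                                   % fix the seed for reproducibility
DataTrain.X = [randn(30,2)+2; randn(30,2)-2; 0.5*randn(30,2)];
DataTrain.Y = [ones(30,1); 2*ones(30,1); 3*ones(30,1)];
DataTrain.Type = 3;
FunPara.c = 10; FunPara.v = 1;            % penalty parameters
FunPara.kerfPara.type = 'lin';            % linear kernel, so kernelfun is never called
FunPara.loop = 2;                         % two projection axes per class
TestX = [randn(10,2)+2; randn(10,2)-2; 0.5*randn(10,2)];
TestY = [ones(10,1); 2*ones(10,1); 3*ones(10,1)];
Predict_Y = K_CLASSLSPTSVM(TestX,DataTrain,FunPara);
accuracy = mean(Predict_Y == TestY)       % fraction of correctly labeled test points

With the rbf kernel, FunPara.kerfPara.pars (the kernel width) must also be set, as in the Example block of the header comments.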
Contacts


For any questions or advice, please email shaoyuanhai21@163.com.


  • Last updated: Dec 27, 2014