[Home]
MPSVM
A demo MATLAB code for the manifold proximal support vector machine (MPSVM) for the semi-supervised classification (SSC) problem.
(Right-click [Code] and choose Save to download the complete MATLAB code.)
Reference
Wei-Jie Chen*, Yuan-Hai Shao, Deng-Ke Xu and Yong-Feng Fu.
Manifold proximal support vector machine for semi-supervised classification[J]. Applied Intelligence, 2014, 40(4): 623-638. (SCI, IF: 1.875)
Main Function
Requires the kernel function (kernelfun.m) and the graph Laplacian function (laplacian.m).
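laplacian.m is not included on this page. Since the main function calls laplacian(12, Data.X), here is a sketch in NumPy of what a k-nearest-neighbor graph Laplacian routine typically computes; the unnormalized form L = D - W, the 0/1 edge weights, and the symmetrization are common choices but are assumptions here, not necessarily the authors' implementation:

```python
import numpy as np

def knn_laplacian(k, X):
    """Unnormalized graph Laplacian L = D - W of a symmetrized
    k-nearest-neighbor graph with 0/1 edge weights (assumed variant)."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    W = np.zeros((n, n))
    for i in range(n):
        # indices of the k nearest neighbors of point i (index 0 is i itself)
        nbrs = np.argsort(D2[i])[1:k + 1]
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)               # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W    # L = D - W
```

By construction L is symmetric positive semidefinite with zero row sums, so the manifold term v'(J'LJ)v in the training objective penalizes decision values that differ across neighboring points.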
function [PredictY,Times] = MPSVM(TestX,Data,FunPara)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% MPSVM: Manifold proximal SVM for SSC problem
%
% [PredictY,Times] = MPSVM(TestX,Data,FunPara)
%
% Input:
% TestX - Test data matrix.
% Each row vector of TestX is a data point.
%
% Data - Struct value in Matlab------Training data.
% Data.X: Input dataset N-by-D data matrix.
% (N examples, D dimensions)
% Data.Y: Label Y matrix.
% (If Y is positive/negative, set 1/-1; else unlabel, set 0)
%
% FunPara - Struct value in Matlab. The fields in FunPara
% that can be set:
% p1: [0,inf] penalty parameter for the empirical risk term.
% p2: [0,inf] penalty parameter for the manifold regularization term.
% kerfPara: Kernel parameters. See kernelfun.m.
%
% Output:
% PredictY - Predicted labels (+1/-1) for TestX.
% Times - Elapsed training time in seconds.
%
%
% Examples:
%
% A = rand(100,2);
% B = rand(100,2)+ 2;
% X = [A;B];
% Y = [ones(4,1);zeros(96,1);-ones(4,1);zeros(96,1)];
% Data.X = X;Data.Y = Y;
% TestX = [rand(100,2);rand(100,2)+ 2;];
% TestY = [ones(100,1);-ones(100,1)];
% FunPara.p1=1;FunPara.p2=1;
% FunPara.kerfPara.type = 'lin';
% PredictY = MPSVM(TestX,Data,FunPara);
% Accuracy = sum(PredictY == TestY)/length(TestY)
%
%Reference:
% Wei-Jie Chen, Yuan-Hai Shao, Deng-Ke Xu and Yong-Feng Fu, "Manifold proximal
% support vector machine for semi-supervised classification," Applied
% Intelligence, 2014, 40(4): 623-638.
%
% version 1.0 --May/2013
% version 1.1 --Aug/2013
% Written by Wei-Jie Chen (wjcper2008@126.com)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Initialization
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
tic;
A = Data.X((Data.Y==1),:);
B = Data.X((Data.Y==-1),:);
K = Data.X;
m1 = size(A,1); m2 = size(B,1);
m = size(Data.X,1);
n = size(Data.X,2);
c1 = FunPara.p1; c2 = FunPara.p2;
e1 = ones(m1,1); e2=ones(m2,1); e = ones(m,1);
kerfPara = FunPara.kerfPara;
L = laplacian(12,Data.X);  % graph Laplacian over all inputs (neighborhood size 12)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Cache kernel matrix
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
if ~strcmp(kerfPara.type,'lin')
K = kernelfun(Data.X,kerfPara);
A = kernelfun(A,kerfPara,Data.X);
B = kernelfun(B,kerfPara,Data.X);
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Train classifier using Eig solver
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
H = [A,e1]; HH = H'*H;    % augmented positive-class data
G = [B,e2]; GG = G'*G;    % augmented negative-class data
J = [K,e];                % all (labeled + unlabeled) augmented data
M = J'*L*J;               % manifold regularization term
HH1 = HH - c1*GG + c2*M;  % objective matrix for the positive-class plane
GG1 = GG - c1*HH + c2*M;  % objective matrix for the negative-class plane
[a1,a2] = eig(HH1); [a3,a4] = eig(GG1);
[~,index_v1] = min(diag(a2));  % eigenvector of the smallest eigenvalue
[~,index_v2] = min(diag(a4));
v1 = a1(:,index_v1);
v2 = a3(:,index_v2);
clear HH GG
Times= toc;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Predict and output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
m3 = size(TestX,1);
e = ones(m3,1);
if ~strcmp(kerfPara.type,'lin')
w1 = sqrt(v1(1:m)'*K*v1(1:m));
w2 = sqrt(v2(1:m)'*K*v2(1:m));
K = [kernelfun(TestX,kerfPara,Data.X),e];
else
w1 = sqrt(v1(1:n)'*v1(1:n));
w2 = sqrt(v2(1:n)'*v2(1:n));
K = [TestX, e];
end
PredictY = sign(abs(K*v2/w2)-abs(K*v1/w1)); % label by the nearer proximal plane
end
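For the linear-kernel branch, the train-and-predict pipeline above can be sketched in NumPy: augment the data with a bias column, form H'H, G'G and the manifold term J'LJ, take the eigenvector of the smallest eigenvalue for each class's objective matrix, then label each test point by the nearer proximal plane. Function names here are illustrative, and the Laplacian is passed in precomputed:

```python
import numpy as np

def mpsvm_linear(X, y, L, c1, c2):
    """Linear MPSVM training sketch: returns the two augmented plane
    vectors v1, v2 = [w; b], each a minimum-eigenvalue eigenvector."""
    A = X[y == 1]                               # labeled positive points
    B = X[y == -1]                              # labeled negative points
    H = np.hstack([A, np.ones((len(A), 1))])    # [A, e1]
    G = np.hstack([B, np.ones((len(B), 1))])    # [B, e2]
    J = np.hstack([X, np.ones((len(X), 1))])    # all points, incl. unlabeled
    HH, GG, M = H.T @ H, G.T @ G, J.T @ L @ J
    # eigh sorts eigenvalues ascending, so column 0 is the minimizer
    _, V1 = np.linalg.eigh(HH - c1 * GG + c2 * M)
    _, V2 = np.linalg.eigh(GG - c1 * HH + c2 * M)
    return V1[:, 0], V2[:, 0]

def mpsvm_predict(TestX, v1, v2):
    """Assign each test point to the class whose plane it lies closer to."""
    K = np.hstack([TestX, np.ones((len(TestX), 1))])
    n = TestX.shape[1]
    w1 = np.linalg.norm(v1[:n])     # normalize by ||w||, as in the MATLAB code
    w2 = np.linalg.norm(v2[:n])
    return np.sign(np.abs(K @ v2 / w2) - np.abs(K @ v1 / w1))
```

This mirrors only the `'lin'` branch; the kernel branch replaces X by the kernel matrix K(X, X) and normalizes by sqrt(v'Kv) instead of the Euclidean norm of w.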
For questions or suggestions, please email wjcper2008@126.com.
- Last updated: Aug 12, 2013