

SGTSVM

Software for the stochastic gradient twin support vector machine (SGTSVM), including C++ code and a Matlab demo.


Reference

Zhen Wang, Yuan-Hai Shao, Nai-Yang Deng, et al., "Stochastic gradient twin support vector machine", submitted, 2017. To download the slides of the paper, right-click [Slide] and choose "Save as".


Software with C++ code

(To download the complete C++ software, right-click [Software] and choose "Save as".)


Main Function

This is a Matlab demo of SGTSVM (the linear version); the nonlinear version additionally requires a kernel function.

function testY = SGDtwinRand(testX,X,Y,P)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% SGTSVM: stochastic gradient twin support vector machine (linear version)
%
% testY = SGDtwinRand(testX,X,Y,P)
%
% Input:
%   testX: Test data.
%   X: Training data.
%   Y: Training data labels (Y must contain 1 and -1).
%   P: Parameters. The fields that can be set:
%     P.c1, P.c2, P.c3, P.c4: (0,inf), parameters to tune the weights.
%     P.T: maximum number of iterations.
%     P.tol: tolerance.
%
% Output:
%   testY: The prediction of testX.
%
% Example:
%   X = rand(50,10);
%   Y = [ones(25,1); -ones(25,1)];
%   testX = rand(20,10);
%   P.c1=1; P.c2=1; P.c3=1; P.c4=1; P.T=1000; P.tol=1e-4;
%   testY = SGDtwinRand(testX,X,Y,P)
%
% Reference:
%   Zhen Wang, Yuan-Hai Shao, Nai-Yang Deng, et al., "Stochastic gradient
%   twin support vector machine", Submitted 2016.
%
% Version 1.0 -- Mar/2016
%
% Written by Zhen Wang (wangzhen@imu.edu.cn)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Initialization
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
[m,n] = size(testX);
sign1 = find(Y==1);                 % indices of the positive class
sign2 = find(Y~=1);                 % indices of the negative class
m1 = length(sign1);
m2 = length(sign2);
X = [X, ones(size(X,1),1)];         % append a bias column
testX = [testX, ones(m,1)];
w1t = zeros(n+1,1);                 % augmented weights of the two hyperplanes
w2t = w1t;
flag1 = 0;                          % convergence flags
flag2 = 0;

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Stochastic gradient iterations
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
for i = 1:P.T
    s1 = sign1(randi(m1,1));        % sample one point from each class
    s2 = sign2(randi(m2,1));
    if flag1==0
        w1 = w1t;
        ind = 1+X(s2,:)*w1;
        indS = 0;
        if ind>0                    % hinge-loss indicator for the negative sample
            indS = 1;
        end
        grad = w1+(X(s1,:)*w1*P.c1*X(s1,:)+indS*P.c2*X(s2,:))';
        w1t = w1-1/i*grad;          % step size 1/i
        if norm(w1t-w1) < P.tol
            flag1 = 1;
        end
    end
    if flag2==0
        w2 = w2t;
        ind = 1-X(s1,:)*w2;
        indS = 0;
        if ind>0                    % hinge-loss indicator for the positive sample
            indS = 1;
        end
        grad = w2+(X(s2,:)*w2*P.c3*X(s2,:)-indS*P.c4*X(s1,:))';
        w2t = w2-1/i*grad;
        if norm(w2t-w2) < P.tol
            flag2 = 1;
        end
    end
    if flag1~=0 && flag2~=0         % stop once both hyperplanes have converged
        break;
    end
end

% Assign each test point to the class of its nearer hyperplane
y1 = abs(testX*w1)/norm(w1(1:n,1));
y2 = abs(testX*w2)/norm(w2(1:n,1));
testY = ones(m,1);
testY(y1>y2) = -1;
end
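
For convenience, here is a minimal usage sketch of the demo above on synthetic two-class data. The data generation, random seed, and parameter values (P.c1 through P.c4, P.T, P.tol) are illustrative assumptions for this sketch, not settings recommended in the paper.

% Minimal usage sketch of SGDtwinRand on synthetic data.
% The data and parameter values below are illustrative assumptions only.
rng(1);                                    % fix the random seed for repeatability
X1 = randn(100,2) + 1;                     % class +1 training samples
X2 = randn(100,2) - 1;                     % class -1 training samples
X  = [X1; X2];
Y  = [ones(100,1); -ones(100,1)];
testX = [randn(20,2) + 1; randn(20,2) - 1];
trueY = [ones(20,1); -ones(20,1)];

P.c1 = 1; P.c2 = 1; P.c3 = 1; P.c4 = 1;    % weighting parameters
P.T   = 1000;                              % maximum number of iterations
P.tol = 1e-4;                              % stopping tolerance

testY = SGDtwinRand(testX, X, Y, P);
accuracy = mean(testY == trueY)            % fraction of correctly labeled test points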
Contacts


For any questions or suggestions, please email wangzhen@imu.edu.cn or shaoyuanhai21@163.com.