
J Control Theory Appl 2009 7(2) 163-168

DOI 10.1007/s11768-009-7026-5

Sparse representation based on projection method in online least squares support vector machines

Lijuan LI, Hongye SU, Jian CHU

(1. State Key Laboratory of Industrial Control Technology, Institute of Advanced Process Control, Zhejiang University, Hangzhou Zhejiang 310027, China;
2. College of Automation, Nanjing University of Technology, Nanjing Jiangsu 210009, China)

Abstract: A sparse approximation algorithm based on projection is presented in this paper in order to overcome the limitation of the non-sparsity of least squares support vector machines (LS-SVM). The new inputs are projected into the subspace spanned by the previous basis vectors (BV), and those inputs whose squared distance from the subspace is higher than a threshold are added to the BV set, while the others are rejected. This consequently results in the sparse approximation. In addition, a recursive approach to deleting an existing vector in the BV set is proposed. Then the online LS-SVM, the sparse approximation, and the BV removal are combined to produce the sparse online LS-SVM algorithm, which can control the size of memory irrespective of the processed data size. The suggested algorithm is applied to the online modeling of a pH neutralizing process and the isomerization plant of a refinery, respectively. A detailed comparison of computing time and precision between the suggested algorithm and the non-sparse one is also given. The results show that the proposed algorithm greatly improves the sparsity with only a small cost in precision.

Keywords: Least squares support vector machines; Projection; Sparsity; pH neutralizing process; Isomerization

1 Introduction

Support vector machines (SVM) [1-4] have drawn much attention for their high generalization ability in classification and regression problems. Like other kernel-based methods, support vector machines combine a high flexibility of the model, by working in high-dimensional feature spaces, with the simplicity that all operations are "kernelized", i.e., they are performed in the input space (a lower-dimensional space) using positive definite kernels. The least squares support vector machine (LS-SVM) [5] loses the sparseness of the standard SVM: almost all elements of the solution vector are nonzero (while in the standard SVM many elements in the solution vector will be exactly equal to zero). This seems to result in a huge increase in computational cost with the number of training data and to preclude the application of LS-SVM to large data sets. An iterative method (i.e., the conjugate gradient method) for solving the large-scale problems has been discussed in [6]. A detailed overview of LS-SVM and its applications can be found in [8].

In LS-SVM and its modified versions, the inversion of a large matrix is involved, with quite a long computing time, which may not satisfy the demand of real-time modeling. A recursive LS-SVM (RLS-SVM) algorithm [9], which can avoid computing the inversion of the large matrix, shows its potential advantage in online modeling. However, the non-sparsity of RLS-SVM precludes its application in online modeling. That is, the computing time would exceed the sampling interval at some sampling instant despite the absence of inversion, because the number of parameters to be solved scales with the size of the training set, and likewise the computation and memory requirements.

The idea of projecting the inputs onto a linear subspace specified by a subset has been proposed by Wahba [10]. L. Csató has successfully realized the sparsity of the Gaussian Process (GP) with the projection method [11]. In this paper, we present an approach using the projection method to overcome the sparsity problem of RLS-SVM. As a new input is added to the training set, its projection onto the feature subspace spanned by the previous inputs is considered, and the residual of the projection determines whether it is added to the basis vectors (BV set, consisting of previous inputs). By excluding some new inputs from the training set, the sparsity is obtained. In addition, the paper presents a recursive method to delete an existent basis vector and thus enable the algorithm to control the size of memory arbitrarily.

2 Recursive least squares support vector machine

2.1 Basic least squares support vector machine

Suppose a given regression data set {(x_i, y_i)}_{i=1}^N, where N is the total number of training data pairs, x_i is the regression vector, and y_i is the output. According to the SVM theory of Vapnik [1, 2], the input space is mapped into a feature space Z, with the nonlinear function φ(·) being the corresponding mapping function. In the feature space we take the following form to estimate the unknown function:

    y(x) = w^T φ(x) + b,  with w ∈ Z, b ∈ R,   (1)

Received 26 January 2007; revised 16 May 2008.
This work was supported by the National Creative Research Groups Science Foundation of China (NCRGSFC: 60721062) and the National Basic Research Program of China (973 Program) (No. 2007CB714000).

where the vector w and the scalar b are the parameters to be identified. The optimization problem is defined as follows:

    min_{w,b,e} J(w, e) = (1/2) w^T w + (γ/2) Σ_{i=1}^N e_i^2,   (2)

subject to

    y_i = w^T φ(x_i) + b + e_i,  i = 1, 2, …, N,   (3)

where e_i is the error between the actual output and the predictive output of the i-th data point.

The LS-SVM model of the data set can be given by

    y(x) = Σ_{i=1}^N α_i K(x, x_i) + b,   (4)

where the α_i (i = 1, 2, …, N) are Lagrange multipliers and the K(x, x_i) (i = 1, 2, …, N) are any kernel functions satisfying the Mercer condition [3]. Typical kernel functions are linear, polynomial, radial basis function (RBF), MLP functions, etc. Analytical solutions for the parameters α_i (i = 1, 2, …, N) and b can be obtained from

    Θ_N = [b α^T]^T = A_N^{-1} [0 Y^T]^T,   (5)

where Y = [y_1 y_2 … y_N]^T, α = [α_1 α_2 … α_N]^T, and A_N is the supposed nonsingular matrix

    A_N = [ 0   1^T
            1   Ω + γ^{-1} I ],   (6)

where 1 = [1 1 … 1]^T and Ω is an N × N symmetric matrix with entries

    Ω_{ij} = φ(x_i)^T φ(x_j) = K(x_i, x_j),  i, j = 1, 2, …, N.   (7)

2.2 Recursive least squares support vector machine

Theorem 1  Consider the function regression problem in Section 2.1, and suppose that Θ_N = [b α^T]^T is the parameter matrix obtained by (5) from the training set consisting of N pairs of data. Let P_N = A_N^{-1}. When another data pair {x_{N+1}, y_{N+1}} is added to the training set, the recursive algorithm for Θ_{N+1} and P_{N+1} can be obtained by

    Θ_{N+1} = [ Θ_N − P_N η_{N+1}^T κ_{N+1}^{-1} (y_{N+1} − η_{N+1} Θ_N) ;
                κ_{N+1}^{-1} (y_{N+1} − η_{N+1} Θ_N) ],   (8)

    P_{N+1} = [ P_N + P_N η_{N+1}^T κ_{N+1}^{-1} η_{N+1} P_N    −P_N η_{N+1}^T κ_{N+1}^{-1} ;
                −κ_{N+1}^{-1} η_{N+1} P_N                       κ_{N+1}^{-1} ],   (9)

where

    η_{N+1} = [1  Ω_{N+1,1}  Ω_{N+1,2}  …  Ω_{N+1,N}],   (10a)
    Ω_{N+1,i} = K(x_{N+1}, x_i),  i = 1, 2, …, N+1,   (10b)
    κ_{N+1} = Ω_{N+1,N+1} + γ^{-1} − η_{N+1} P_N η_{N+1}^T.   (10c)

See the detailed proof of Theorem 1 in [9]. With this theorem the computation of the parameter Θ_N = [b α^T]^T can be done recursively by (8), and the runtime and memory requirement at each sampling time are reduced.
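The update (8)-(10) is a standard bordered-matrix block inversion; a minimal Python sketch, reusing rbf_kernel from the sketch above under the same RBF-kernel assumption (the function name is ours):

    import numpy as np

    def lssvm_update(theta, P, X, x_new, y_new, gamma=10.0, sigma=1.0):
        # One step of Theorem 1: extend theta = [b; alpha] and P = A_N^{-1}
        # to N+1 samples without re-inverting the full system matrix.
        k = rbf_kernel(X, x_new[None, :], sigma).ravel()   # K(x_i, x_{N+1})
        eta = np.concatenate(([1.0], k))                   # eq. (10a)
        kss = rbf_kernel(x_new[None, :], x_new[None, :], sigma)[0, 0]
        kappa = kss + 1.0 / gamma - eta @ P @ eta          # eq. (10c)
        Pe = P @ eta
        r = (y_new - eta @ theta) / kappa                  # scaled innovation
        theta_new = np.concatenate((theta - Pe * r, [r]))  # eq. (8)
        P_new = np.block([[P + np.outer(Pe, Pe) / kappa, -Pe[:, None] / kappa],
                          [-Pe[None, :] / kappa, np.array([[1.0 / kappa]])]])  # eq. (9)
        return theta_new, P_new, np.vstack((X, x_new[None, :]))

Each call costs O(N^2) instead of the O(N^3) of a fresh solve, but without sparsification both theta and P still grow with every sample, which is exactly the problem Section 3 addresses.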

3 Sparse representation

Consider the map φ(x) of an input in the feature space Z. The projection of all data points into Z defines a data manifold with a usually much smaller dimension than the dimension of the embedding space Z. Nevertheless, since the data are more dispersed across the dimensions than in the input space, the feature-space projection gives more freedom for the linear algorithms to reach a solution in the regression or classification case.

Observing (4), we find that the resulting LS-SVM model is expressed in terms of the training data set, i.e., in the subspace spanned by the training data set, whose dimension is much lower than that of the feature space Z. The set of points composing the resulting LS-SVM model (4) is called the "Basis Vectors" (BV set) in this paper.

The online LS-SVM algorithm adds a single input φ(x_{N+1}) to the model at each step, and this addition implies the extension of the basis vectors which express the model (4). Nevertheless, if the new input is in the linear span of all previous inputs, the addition of a new vector to the BV set is not necessary. Indeed, if we write the map φ(x_{N+1}) as

    φ(x_{N+1}) = Σ_{i=1}^N ê_i φ(x_i) = Φ_N ê_{N+1},   (11)

where ê_{N+1} = [ê_1 ê_2 … ê_N]^T are the coordinates of the projection onto the subspace spanned by the previous inputs, then substituting (11) into (4) yields an equivalent representation of the model that uses only the first N basis vectors. Therefore, the new vector can be passed over without being added to the BV set. Unfortunately, (11) does not hold in most cases. We therefore decompose the new feature vector φ(x_{N+1}) as a linear combination of the previous features and a residual item,

    φ(x_{N+1}) = φ̂(x_{N+1}) + e_{N+1} φ_res = Σ_{i=1}^N ê_i φ(x_i) + e_{N+1} φ_res = Φ_N ê_{N+1} + e_{N+1} φ_res,   (12)

where φ̂(x_{N+1}) = Φ_N ê_{N+1} is the projection of φ(x_{N+1}) onto the previous inputs. All previous feature vectors have been grouped into the matrix Φ_N = [φ(x_1) φ(x_2) … φ(x_N)], and φ_res is the residual direction, i.e., the unit vector orthogonal to the first N feature vectors. The parameter

    ε_{N+1} = ||φ(x_{N+1}) − φ̂(x_{N+1})||²   (13)

is the squared length of the residual. Fig. 1 provides a visualization of the inputs and the projection in the feature space: the residual is plotted as the orthogonal vector with squared length ε_{N+1}, and span(φ_1, φ_2, …, φ_N) is the subspace spanned by the previous inputs.

Fig. 1  Visualization of the projection onto the subspace.

If the map φ(x_{N+1}) of the new input lies exactly in the subspace spanned by the previous inputs, then ε_{N+1} = 0 holds

and the new input can be kept out of the BV set. Consequently, the parameter ε_{N+1} is defined as the error caused by the projection, and a positive threshold ε_low on ε_{N+1} is predefined in our algorithm. The sparsity of RLS-SVM is then achieved in the following manner: only those inputs with ε_{N+1} > ε_low are added to the BV set, and the others are kept out.
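In outline, the resulting online loop interleaves this test with the recursive update of Theorem 1. The driver below is a hypothetical sketch of ours, not pseudocode from the paper; the residual test should_add_to_bv it relies on is implemented after (14) in Section 3.1 below:

    import numpy as np

    def sparse_online_lssvm(X, y, stream, eps_low=1e-3, gamma=10.0, sigma=1.0):
        # Hypothetical driver (ours): initialize from a small batch, then admit
        # only inputs whose projection residual exceeds eps_low (Section 3).
        N = X.shape[0]
        A = np.zeros((N + 1, N + 1))
        A[0, 1:] = A[1:, 0] = 1.0
        A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(N) / gamma
        P = np.linalg.inv(A)                        # P_N = A_N^{-1} of Theorem 1
        theta = P @ np.concatenate(([0.0], y))      # eq. (5)
        for x_new, y_new in stream:
            if should_add_to_bv(X, x_new, eps_low, sigma):   # test of Section 3
                theta, P, X = lssvm_update(theta, P, X, x_new, y_new, gamma, sigma)
            # otherwise x_new is rejected and the BV set keeps its current size
        return theta, X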

3.1 Computing involved parameters

To determine whether a new input is added to the BV set, the projection coordinates ê_{N+1} = [ê_1 ê_2 … ê_N]^T and the squared distance ε_{N+1} need to be computed first. Now we will solve this problem.

Expanding ε_{N+1} in (13) with φ̂(x_{N+1}) = Φ_N ê_{N+1} and replacing the scalar products in the feature space with the corresponding kernel functions, we get

    ε_{N+1} = ||φ(x_{N+1}) − φ̂(x_{N+1})||²
            = φ(x_{N+1})^T φ(x_{N+1}) − 2 ê_{N+1}^T Φ_N^T φ(x_{N+1}) + ê_{N+1}^T Φ_N^T Φ_N ê_{N+1}
            = Ω_{N+1,N+1} − 2 ê_{N+1}^T k_{N+1} + ê_{N+1}^T K_N ê_{N+1},   (14)

where K_N is the kernel matrix formed by the N previous inputs and k_{N+1} = [K(x_1, x_{N+1}) K(x_2, x_{N+1}) … K(x_N, x_{N+1})]^T.
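Since the projection coordinates minimize the residual (13), they satisfy the normal equations K_N ê_{N+1} = k_{N+1}. A minimal Python sketch of the resulting test (ours; the jitter term added to K_N for numerical stability is our assumption) evaluates (14) and compares it with ε_low:

    import numpy as np

    def projection_residual(X_bv, x_new, sigma=1.0, jitter=1e-10):
        # e_hat solves K_N e_hat = k_{N+1} (normal equations of the projection);
        # eq. (14) then gives the squared distance of phi(x_new) from the span.
        K = rbf_kernel(X_bv, X_bv, sigma) + jitter * np.eye(X_bv.shape[0])
        k = rbf_kernel(X_bv, x_new[None, :], sigma).ravel()
        e_hat = np.linalg.solve(K, k)
        kss = rbf_kernel(x_new[None, :], x_new[None, :], sigma)[0, 0]
        eps = kss - 2.0 * (e_hat @ k) + e_hat @ K @ e_hat    # eq. (14)
        return e_hat, eps

    def should_add_to_bv(X_bv, x_new, eps_low=1e-3, sigma=1.0):
        # Sparsification rule of Section 3: admit x_new into the BV set only
        # if its squared residual distance exceeds the threshold eps_low.
        _, eps = projection_residual(X_bv, x_new, sigma=sigma)
        return eps > eps_low

With an RBF kernel, Ω_{N+1,N+1} = 1, so eps lies between 0 (the new input is already in the span) and 1 (it is orthogonal to the span), which makes the threshold eps_low straightforward to tune.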
