
Neural Network Compact Ensemble and Its Applications

CHINESE JOURNAL OF MECHANICAL ENGINEERING

Vol. 23, No. 2, 2010, p. 209

DOI: 10.3901/CJME.2010.02.209, available online at www.cjmenet.com; www.cjmenet.com.cn


WANG Qinghua, ZHANG Youyun, and ZHU Yongsheng

1 Key Laboratory for Modern Design and Rotor-Bearing System of Ministry of Education, Xi'an Jiaotong University, Xi'an 710049, China

2 School of Mechatronic Engineering, Xi'an Technological University, Xi'an 710032, China

Received January 4, 2009; revised March 5, 2010; accepted March 10, 2010; published electronically March 12, 2010

Abstract: There have been many methods of constructing neural network (NN) ensembles, among which simultaneous training has succeeded in generalization performance and efficiency. But just like regular methods of constructing NN ensembles, it follows two steps: first training the component networks, and then combining them. As the two steps are independent, an assumption is used to facilitate interactions among NNs during the training stage. This paper presents a compact ensemble method which integrates the two steps of ensemble construction into one step by attempting to train individual NNs in an ensemble and weight the individual members adaptively according to their individual performance in the same learning process. This provides an opportunity for the individual NNs to interact with each other based on their real contributions to the ensemble. The classification performance of the NN compact ensemble (NNCE) was validated through some benchmark problems in machine learning, including Australian credit card assessment, Pima Indians diabetes, heart disease, breast cancer and glass. Compared with other ensembles, the classification error rate of NNCE can be decreased by 0.45% to 68%. In addition, the NNCE was applied to fault diagnosis for rolling element bearings. Eleven time-domain statistical features are extracted as the properties of the data, and the NNCE is employed to classify the data. The results of several experiments show that the compact ensemble method gives good generalization performance. The compact ensemble method can recognize the different fault types and various fault degrees of the same fault type.

Keywords: neural network compact ensemble (NNCE), combination weights, classification performance, fault diagnosis

1 Introduction

Neural network (NN) ensembles are receiving increasing attention due to their stronger generalization ability compared with a single NN. In general, most methods of constructing NN ensembles follow two steps: first training the component networks, and then combining them. Both theoretical and empirical work has shown that a good NN ensemble must have accurate individual networks, and the errors of these networks must lie on different parts of the input space. In order to obtain accurate and diverse networks, there are three kinds of training methods: independent training, sequential training, and simultaneous training. Independent training methods emphasize independence among individual NNs by varying the initial random weights, the architectures, the learning algorithm used, and the data. Sequential training methods train a set of networks in a particular order so as to decorrelate the errors of the individual networks. Neither of them allows the individual networks to interact before the integration stage.

Simultaneous training methods train a set of networks together. Negative correlation learning (NCL) is an example of a simultaneous training method. Since NCL can achieve stronger generalization ability, overcoming the disadvantages of independent training and sequential training, it has attracted much attention. GAVIN and JEREMY studied the formal basis behind NCL and revealed the bound of the penalty parameter. Multi-population particle swarm optimization and incremental training have been presented to gain a negatively correlated NN ensemble in which the architecture of each NN is automatically configured. CHAN and KASABOV proposed a new method which introduces sets of correlation-corrected data (CC data) as new training data to improve the generalization ability of NCL. Although the NCL method has succeeded in simultaneous training, the interaction among individual networks is based on the assumption that the ensemble output is a simple average. This assumption is irrational because not all the networks are equally important.

Corresponding author. E-mail: wqhhuazi@gmail.com
This project is supported by National Natural Science Foundation of China (Grant No. 50575179), and National Hi-tech Research and Development Program of China (863 Program, Grant No. 2006AA04Z420)

This paper proposes a neural network compact ensemble (NNCE) method which is quite different from other ensemble approaches. Other approaches usually train the individual networks first, then apply some rules or algorithms to modulate the combination weights according to their performance. That is to say, training the component networks and combining them are two independent steps, so the individual members cannot interact with each other, or can only interact based on some assumption rather than on real conditions, as in independent training and negative correlation learning. NNCE integrates the two steps of ensemble construction into one step by attempting to train the component networks in an ensemble and modulate the combination weights in the same learning process. That is, the goal of individual training is to generate the best result for the whole ensemble. NNCE keeps the collective decision strategy consistent from the training stage to the working stage. In addition, the interactions among the individual NNs in an ensemble are based on the real contributions of the networks to the ensemble, rather than on the assumption of simple averaging integration as in negative correlation learning.

NNCE was analyzed on the iris data set to show how and why NNCE works and how it differs from other ensemble methods. A number of other benchmark problems, including Australian credit card, Pima Indians diabetes, heart disease, breast cancer and glass, are used to compare NNCE with some NN ensemble algorithms. In terms of generalization ability, experimental results show clearly that NNCE is better than other ensembles in almost all cases. NNCE was also applied to fault diagnosis for rolling element bearings.

The rest of this paper is organized as follows. Section 2 describes NNCE in detail. Section 3 analyzes the classification performance of NNCE. Section 4 applies NNCE to fault diagnosis for rolling element bearings. Finally, Section 5 concludes the paper with a brief summary.

2 Neural Network Compact Ensemble

Combination weights reflect the influences of the individual NNs on an ensemble. Because the individual members in an ensemble are required to be both accurate and diverse, the influences of the individual members are different. How to evaluate the combination weights properly, how to train the individual NNs based on the evolving combination weights, and what the relationship is between modulating the combination weights and training the members: these questions are answered by the NNCE algorithm.

Suppose that we have a training set

D = \{(\boldsymbol{x}(1), d(1)), \ldots, (\boldsymbol{x}(N), d(N))\},

where \boldsymbol{x} \in \mathbf{R}^p, d is a scalar, and N is the size of the training set. The assumption that the output d is a scalar has been made merely to simplify the exposition of ideas without loss of generality. NNCE considers estimating d by forming an ensemble whose output is a weighted average of the outputs of a set of neural networks. These weights, called combination weights, are changeable in the learning process. The output of the ensemble is described as

y_j(n) = \sum_{i=1}^{M} w_i F_{ij}(n),  (1)

where w_i is the combination weight of the ith network in the ensemble, M is the number of individual neural networks in the ensemble, F_{ij}(n) is the jth neuron output of network i on the nth training pattern, and y_j(n) is the jth output of the ensemble on the nth training pattern.
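To make Eq. (1) concrete, the weighted ensemble output can be computed in a few lines of NumPy. This is an illustrative sketch, not the authors' code; the array names `F` (member outputs F_ij(n)) and `w` (combination weights), as well as the sizes, are assumptions of the example.

```python
import numpy as np

# Hypothetical sizes: M member networks, N training patterns, C output cells.
M, N, C = 3, 4, 2

rng = np.random.default_rng(0)
F = rng.random((M, N, C))       # F[i, n, j]: jth output of network i on pattern n
w = np.full(M, 1.0 / M)         # combination weights, summing to one

# Eq. (1): y_j(n) = sum_i w_i * F_ij(n), a weighted average over the members.
y = np.tensordot(w, F, axes=1)  # shape (N, C)
```

Because the weights sum to one, `y` stays in the same range as the member outputs, which is what lets the combination weights be read as relative influences.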

The average error function of the ensemble is defined by

E = \frac{1}{N} \sum_{n=1}^{N} E(n),  (2)

E(n) = \frac{1}{2C} \sum_{j=1}^{C} e_j^2(n), \quad e_j(n) = y_j(n) - d_j(n),  (3)

where E(n) is the value of the error function of the ensemble at the presentation of the nth training pattern, e_j(n) is the error of the jth cell output of the ensemble, and C is the number of output cells of the ensemble.
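The error measure above is a per-cell squared error averaged over the output cells and the training patterns. A minimal sketch (my own variable names, with random stand-in data):

```python
import numpy as np

# Stand-in data: N patterns, C output cells.
N, C = 4, 2
rng = np.random.default_rng(1)
y = rng.random((N, C))                  # ensemble outputs y_j(n)
d = rng.random((N, C))                  # targets d_j(n)

e = y - d                               # e_j(n), the per-cell ensemble error
E_n = (e ** 2).sum(axis=1) / (2 * C)    # Eq. (3): error at each pattern
E = E_n.mean()                          # Eq. (2): average over the N patterns
```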

To address the first question, in order to evaluate the combination weights properly and make them reflect the real contributions of the members to the ensemble, the empirical risk function of the ensemble is used to modulate the combination weights by solving the following constrained optimization problem in the learning process:

\min E, \quad \text{s.t.} \ \sum_{i=1}^{M} w_i = 1, \ 0 \leq w_i \leq 1.  (4)

Here we adopt the gradient descent algorithm to minimize the risk function of the ensemble:

\Delta w_i = -\alpha \frac{\partial E}{\partial w_i} = -\frac{\alpha}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} \big( y_j(n) - d_j(n) \big) F_{ij}(n).  (5)

Eq. (5) can be converted into the following form:

\Delta w_i = -\frac{\alpha}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} \Big\{ \sum_{k=1}^{M} w_k F_{kj}(n) F_{ij}(n) - d_j(n) F_{ij}(n) \Big\}.  (6)

In Eq. (6), F_{kj}(n) F_{ij}(n) reflects the correlation of the posterior probability estimations between the ith NN and the kth NN, and d_j(n) F_{ij}(n) is the correlation between the posterior probability estimation of the ith NN and the real posterior probability, which reflects the accuracy of the ith NN. So we can say that updating the combination weights improves the accuracy of the ith NN and decreases its correlation with the other NNs.

Eq. (5) can also be described as

\Delta w_i = -\frac{\alpha}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} e_j(n) F_{ij}(n).  (7)

Eq. (7) indicates that the update of the combination weights is related to the ensemble error. It should be noted that, in each iteration, w_i must be normalized to keep the sum of all combination weights equal to one.
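One way to realize this update together with the constraint in Eq. (4) is a plain gradient step followed by clipping to [0, 1] and renormalization. The sketch below is an assumption-laden illustration (the function name, learning rate `alpha`, and the clip-then-normalize projection are mine), not the paper's implementation:

```python
import numpy as np

def update_combination_weights(w, F, d, alpha=0.1):
    """One gradient step on the combination weights (Eqs. (5) and (7)).

    w: (M,) combination weights; F: (M, N, C) member outputs F_ij(n);
    d: (N, C) targets d_j(n).
    """
    M, N, C = F.shape
    y = np.tensordot(w, F, axes=1)              # ensemble output, Eq. (1)
    e = y - d                                   # ensemble error e_j(n)
    # Eq. (7): dw_i = -(alpha / (N C)) * sum_{n,j} e_j(n) F_ij(n)
    grad = np.einsum('nj,inj->i', e, F) / (N * C)
    w = w - alpha * grad
    # Constraint (4): keep 0 <= w_i <= 1, then renormalize to sum to one.
    w = np.clip(w, 0.0, 1.0)
    return w / w.sum()

rng = np.random.default_rng(2)
F = rng.random((3, 8, 2))
d = rng.random((8, 2))
w = update_combination_weights(np.full(3, 1.0 / 3.0), F, d)
```

A member whose outputs correlate with the target sees its weight grow, while a member that mostly duplicates the others is pushed down, matching the interpretation of Eq. (6).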

The tunable parameters of the individual networks, which take the form of the inner connection weights, are updated by minimizing the average error function of the ensemble. For the ith individual neural network in the ensemble, the connection weights from the output layer to the hidden layer (w_o^i) are updated by calculating the partial derivative of E with respect to these connection weights. Then the error is propagated backwards to the weights from the input layer to the hidden layer (w_h^i):

\Delta w_o^i = -\frac{\eta}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} w_i e_j(n) \frac{\partial F_{ij}(n)}{\partial w_o^i},  (8)

\Delta w_h^i = -\frac{\eta}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} w_i e_j(n) \frac{\partial F_{ij}(n)}{\partial w_h^i}.  (9)

Let us review the backpropagation (BP) algorithm. In order to keep the symbol expressions consistent, the superscript i is used to distinguish the individual networks from the ensemble. The average error function of an individual network is defined as follows:

E^i = \frac{1}{N} \sum_{n=1}^{N} E^i(n),  (10)

E^i(n) = \frac{1}{2C} \sum_{j=1}^{C} \big( e_j^i(n) \big)^2, \quad e_j^i(n) = F_{ij}(n) - d_j(n),  (11)

where E^i is the average error function of network i, E^i(n) is the value of the error function of network i at the presentation of the nth training pattern, and e_j^i(n) is the error of the jth cell output of network i. The connection weights of network i, w_o^i and w_h^i, are updated by the following equations:

\Delta w_o^i = -\frac{\eta}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} e_j^i(n) \frac{\partial F_{ij}(n)}{\partial w_o^i},  (12)

\Delta w_h^i = -\frac{\eta}{NC} \sum_{n=1}^{N} \sum_{j=1}^{C} e_j^i(n) \frac{\partial F_{ij}(n)}{\partial w_h^i}.  (13)

By comparing and contrasting Eqs. (8) and (9) with Eqs. (12) and (13) respectively, we can see that in the expressions of \Delta w_o^i and \Delta w_h^i in NNCE, the product of the ensemble error (e_j(n)) and the combination weight (w_i) of network i is substituted for the network error (e_j^i(n)). This shows that NNCE trains the individual networks by minimizing the ensemble generalization error, not the individual error. That is, NNCE does not put emphasis on individual accuracy, but on ensemble accuracy.

As shown in Eqs. (8) and (9), the connection weight updates of the individual networks are closely related to the combination weights, which reflect their training status. This indicates that the individual training is based on the evolving combination weights. Combination weights reflect the networks' influences on the ensemble, so we say that NNCE trains the individual networks based on their own contributions to the ensemble. The signal flow chart of NNCE is summarized in Fig. 1.

[Figure: signal flow from the input neurons to the output neurons]

Fig. 1. Signal flow chart of NNCE

From Fig. 1 and the above description of the NNCE algorithm, NNCE differs from previous work in designing and training NN ensembles in a number of ways.

(1) It trains the component networks based on their real contributions to the ensemble, not on the assumption of a simple average.

(2) It adjusts the inner parameters of the individual networks in an ensemble and the combination weights in the same learning process.

(3) The interaction of the individual networks in the ensemble is conveyed via the evolving combination weights in the error function.

(4) The implementation of the algorithm is compact, since the training step and the combining step are integrated into one step.
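The compact scheme described above can be sketched as a single training loop in which every epoch updates both the inner weights of each member (Eqs. (8) and (9), driven by w_i times the ensemble error) and the combination weights (Eq. (7)). Everything below is illustrative: tiny one-hidden-layer tanh/sigmoid networks, made-up names, and a toy two-class problem, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

def forward(net, x):
    """One-hidden-layer network: tanh hidden layer, sigmoid output."""
    h = np.tanh(x @ net['Wh'])
    o = 1.0 / (1.0 + np.exp(-(h @ net['Wo'])))
    return h, o

def train_nnce(X, D, M=3, hidden=5, alpha=0.05, eta=0.1, epochs=200):
    N, C = D.shape
    nets = [{'Wh': rng.normal(0.0, 0.5, (X.shape[1], hidden)),
             'Wo': rng.normal(0.0, 0.5, (hidden, C))} for _ in range(M)]
    w = np.full(M, 1.0 / M)                   # combination weights
    for _ in range(epochs):
        outs = [forward(net, X) for net in nets]
        F = np.stack([o for _, o in outs])    # (M, N, C) member outputs
        y = np.tensordot(w, F, axes=1)        # Eq. (1): ensemble output
        e = y - D                             # ensemble error e_j(n)
        for i, net in enumerate(nets):
            h, o = outs[i]
            # Eqs. (8)-(9): member i is trained on w_i * e, the ensemble
            # error scaled by its combination weight, not on its own error.
            delta_o = w[i] * e * o * (1.0 - o)
            delta_h = (delta_o @ net['Wo'].T) * (1.0 - h ** 2)
            net['Wo'] -= eta / (N * C) * (h.T @ delta_o)
            net['Wh'] -= eta / (N * C) * (X.T @ delta_h)
        # Eq. (7) plus constraint (4): gradient step, clip, renormalize.
        w -= alpha * np.einsum('nj,inj->i', e, F) / (N * C)
        w = np.clip(w, 0.0, 1.0)
        w /= w.sum()
    return nets, w

# Toy problem: label a point by the sign of its first coordinate.
X = rng.normal(size=(40, 2))
D = (X[:, :1] > 0).astype(float)
nets, w = train_nnce(X, D)
```

The point of the sketch is the coupling: the same error signal `e` drives both the member updates and the weight updates, so training and combining happen in one process.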


3 Classification Performance of NNCE

This section analyses NNCE on the iris data set to show
