9 Statistics and Neural Networks

9.1 Linear and nonlinear regression

Feed-forward networks are used to find the best functional fit for a set of input-output examples. Changes to the network weights allow fine-tuning of the network function in order to detect the optimal configuration. However, two complementary motivations determine our perception of what optimal means in this context. On the one hand we expect the network to map the known inputs as exactly as possible to the known outputs. But on the other hand the network must be capable of generalizing, that is, unknown inputs are to be compared to the known ones and the output produced is a kind of interpolation of learned values. However, good generalization and minimal reproduction error of the learned input-output pairs can become contradictory objectives.

9.1.1 The problem of good generalization

Figure 9.1 shows the problem from another perspective. The dots in the graphic represent the training set. We are looking for a function capable of mapping the known inputs into the known outputs. If linear approximation is used, as in the figure, the error is not excessive and new unknown values of the input x are mapped to the regression line. Figure 9.2 shows another kind of functional approximation using linear splines which can reproduce the training set without error. However, when the training set consists of experimental points, normally there is some noise in the data. Reproducing the training set exactly is not the best strategy, because the noise will also be reproduced. A linear approximation as in Figure 9.1 could be a better alternative than the exact fit of the training data shown in Figure 9.2. This simple example illustrates the two contradictory objectives of functional approximation: minimization of the training error but also minimization of the error of yet unknown inputs. Whether or not the training set can be
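As a minimal illustration of this trade-off, the following sketch compares a least-squares line (as in Figure 9.1) with a piecewise linear interpolation that reproduces every training point exactly (as in Figure 9.2) on a small noisy data set. The data, noise level, and function names are hypothetical and chosen only for the example; they are not taken from the text.

```python
import numpy as np

# Hypothetical noisy training set: a linear trend plus measurement noise
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 8)
y_train = 2.0 * x_train + 0.5 + rng.normal(scale=0.1, size=x_train.shape)

# Linear regression (Figure 9.1): least-squares fit of a straight line
slope, intercept = np.polyfit(x_train, y_train, deg=1)

# Linear splines (Figure 9.2): piecewise linear interpolation, which
# reproduces every training point exactly, noise included
def spline_fit(x_new):
    return np.interp(x_new, x_train, y_train)

# Errors on the training set: the interpolant is exact, the line is not
line_train_mse = np.mean((slope * x_train + intercept - y_train) ** 2)
spline_train_mse = np.mean((spline_fit(x_train) - y_train) ** 2)

# Errors on new inputs drawn from the noiseless underlying trend: the
# line usually does better here, because it does not reproduce the noise
x_new = np.linspace(0.05, 0.95, 50)
y_true = 2.0 * x_new + 0.5
line_test_mse = np.mean((slope * x_new + intercept - y_true) ** 2)
spline_test_mse = np.mean((spline_fit(x_new) - y_true) ** 2)

print("training MSE (line, spline):", line_train_mse, spline_train_mse)
print("test MSE     (line, spline):", line_test_mse, spline_test_mse)
```

Running the sketch typically shows the interpolant with zero training error but a larger error on the new inputs than the regression line, which is exactly the tension between fitting the examples and generalizing.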