This Is Auburn: Electronic Theses and Dissertations


Utilizing Dual Neural Networks as a Tool for Training, Optimization, and Architecture Conversion


Metadata field: Value (Language)
dc.contributor.advisor: Wilamowski, Bogdan
dc.contributor.advisor: Baginski, Michael
dc.contributor.advisor: Vodyanoy, Vitaly
dc.contributor.advisor: Lall, Pradeep
dc.contributor.author: Hunter, David Shawn
dc.date.accessioned: 2013-04-23T13:46:10Z
dc.date.available: 2013-04-23T13:46:10Z
dc.date.issued: 2013-04-23
dc.identifier.uri: http://hdl.handle.net/10415/3596
dc.description.abstract: Little prior work has explored Dual Neural Networks (DNNs) or the advances they might enable when used for conversion between network architectures. By leveraging the efficiencies of different architectures, one can begin to draw conclusions about the potential of network conversion: if the advantages of multiple network architectures could be harnessed and combined into a single network, significant advances in artificial neural networks (ANNs) would be possible. By introducing the DNN as a tool for training, optimization, and architecture conversion, we find that this architecture is key to unlocking the strengths of other network architectures. Results in this study show that DNN networks achieve significantly higher overall success rates than BMLP and MLP networks; in fact, the DNN architecture had either the highest or the second-highest success rate in every experiment. With the conversion methods presented in this study, we now have not only a path for converting among BMLP, DNN, and MLP architectures, but also a means of training networks that were previously untrainable. (en_US)
dc.rights: EMBARGO_NOT_AUBURN (en_US)
dc.subject: Electrical Engineering (en_US)
dc.title: Utilizing Dual Neural Networks as a Tool for Training, Optimization, and Architecture Conversion (en_US)
dc.type: dissertation (en_US)
dc.embargo.length: NO_RESTRICTION (en_US)
dc.embargo.status: NOT_EMBARGOED (en_US)
