Utilizing Dual Neural Networks as a Tool for Training, Optimization, and Architecture Conversion
Hunter, David Shawn
Type of Degree: Dissertation
Little attention has been devoted to Dual Neural Networks (DNNs) or to the advances they might enable when used for conversion between network architectures. If the strengths of multiple architectures could be combined into a single network, artificial neural networks (ANNs) could advance considerably, and architecture conversion offers a path toward that goal. This study introduces the DNN as a tool for training, optimization, and architecture conversion, and finds that this newly presented architecture is key to unlocking the strengths of other network architectures. Results show that DNNs achieve significantly higher overall success rates than bridged multilayer perceptron (BMLP) and multilayer perceptron (MLP) networks; in fact, the DNN architecture ranked either first or second in success rate in every experiment. The conversion methods presented in this study provide not only a path for converting among BMLP, DNN, and MLP architectures, but also a means of training networks that were previously untrainable.
- David Hunter - Dissertation - 20130410 - Final.pdf