Image super-resolution performance of multilayer feedforward neural networks
dc.contributor.advisor | Hunt, Bobby R. | en_US |
dc.contributor.author | Davila, Carlos Antonio | |
dc.creator | Davila, Carlos Antonio | en_US |
dc.date.accessioned | 2013-04-25T10:15:33Z | |
dc.date.available | 2013-04-25T10:15:33Z | |
dc.date.issued | 1999 | en_US |
dc.identifier.uri | http://hdl.handle.net/10150/284549 | |
dc.description.abstract | Super-resolution is the process by which the bandwidth of a diffraction-limited spectrum is extended beyond the optical passband. Many algorithms are capable of super-resolution; however, most are iterative methods, which are ill-suited for real-time operation. One approach that has been virtually ignored in super-resolution research is the neural network approach. The Hopfield network has been a popular choice in image restoration applications; however, it is also an iterative approach. We consider the feedforward architecture known as the Multilayer Perceptron (MLP) and present results on simulated binary and greyscale images blurred by a diffraction-limited OTF and sampled at the Nyquist rate. To avoid aliasing, the network performs as a nonlinear spatial interpolator while simultaneously extrapolating in the frequency domain. Additionally, a novel use of vector quantization for the generation of training data sets is presented. This is accomplished by training a nonlinear interpolative vector quantizer (NLIVQ), whose codebooks are subsequently used in the supervised training of the MLP network via back-propagation. The network shows good regularization in the presence of noise. | |
dc.language.iso | en_US | en_US |
dc.publisher | The University of Arizona. | en_US |
dc.rights | Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. | en_US |
dc.subject | Engineering, Electronics and Electrical. | en_US |
dc.subject | Physics, Optics. | en_US |
dc.title | Image super-resolution performance of multilayer feedforward neural networks | en_US |
dc.type | text | en_US |
dc.type | Dissertation-Reproduction (electronic) | en_US |
thesis.degree.grantor | University of Arizona | en_US |
thesis.degree.level | doctoral | en_US |
dc.identifier.proquest | 9934855 | en_US |
thesis.degree.discipline | Graduate College | en_US |
thesis.degree.discipline | Electrical and Computer Engineering | en_US |
thesis.degree.name | Ph.D. | en_US |
dc.description.note | This item was digitized from a paper original and/or a microfilm copy. If you need higher-resolution images for any content in this item, please contact us at repository@u.library.arizona.edu. | |
dc.identifier.bibrecord | .b39652245 | en_US |
dc.description.admin-note | Original file replaced with corrected file September 2023. | |
refterms.dateFOA | 2018-07-03T00:26:00Z | |
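As a rough illustration of the architecture the abstract describes (a feedforward MLP mapping a blurred, Nyquist-sampled low-resolution patch to an upsampled high-resolution patch, so that spatial interpolation implies frequency-domain extrapolation), here is a minimal NumPy sketch. The patch geometry (5x5 in, 10x10 out), the layer sizes, and the random weights are all assumptions for illustration, not the dissertation's actual configuration; a real system would train the weights with back-propagation on patch pairs derived from the NLIVQ codebooks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical geometry: a 5x5 blurred, Nyquist-sampled patch is mapped
# to a 10x10 patch, i.e. 2x spatial interpolation.
N_IN, N_HID, N_OUT = 5 * 5, 64, 10 * 10

# Untrained random weights, standing in for weights learned via
# back-propagation in the actual method.
W1 = rng.normal(0.0, 0.1, (N_HID, N_IN))
b1 = np.zeros(N_HID)
W2 = rng.normal(0.0, 0.1, (N_OUT, N_HID))
b2 = np.zeros(N_OUT)

def mlp_superresolve(patch_lr):
    """Forward pass: 5x5 low-res patch -> estimated 10x10 high-res patch."""
    x = patch_lr.ravel()
    h = np.tanh(W1 @ x + b1)   # nonlinear hidden layer
    y = W2 @ h + b2            # linear output layer
    return y.reshape(10, 10)

patch = rng.random((5, 5))     # stand-in for a diffraction-blurred patch
hr = mlp_superresolve(patch)
print(hr.shape)                # (10, 10)
```

In use, overlapping low-resolution patches would be processed independently and the outputs mosaicked into the full high-resolution image; the nonlinearity in the hidden layer is what allows the network to synthesize frequency content beyond the optical passband.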