Stereoscopic vergence control and horizontal tracking using biologically inspired filters
dc.contributor.advisor | Higgins, Charles M. | en_US |
dc.contributor.author | Schwager, Michael Anthony | |
dc.creator | Schwager, Michael Anthony | en_US |
dc.date.accessioned | 2013-04-03T13:35:16Z | |
dc.date.available | 2013-04-03T13:35:16Z | |
dc.date.issued | 2000 | en_US |
dc.identifier.uri | http://hdl.handle.net/10150/278745 | |
dc.description.abstract | One of the requirements for enabling a robot to see in 3D is the ability to move its gaze onto a target. Vergence is the disconjugate horizontal rotation of the cameras that brings their gaze onto the target; tracking is the conjugate rotation. The difference between the two images captured by stereoscopic cameras (disparity) is a sufficient measure to accomplish both tasks. We reviewed studies of how the cat visual cortex measures disparity, combined the resulting disparity-energy model with neurophysiological models of vergence control, and developed a system that also controls horizontal tracking. Experiments using software and inexpensive custom hardware confirm the operation of the system. An architecture is presented for implementing this system in analog VLSI hardware, offering a high degree of parallelism, low power consumption, real-time operation, flexibility, and scalability. We discuss how to compare this vision system with others. | |
dc.language.iso | en_US | en_US |
dc.publisher | The University of Arizona. | en_US |
dc.rights | Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. | en_US |
dc.subject | Engineering, Electronics and Electrical. | en_US |
dc.subject | Computer Science. | en_US |
dc.title | Stereoscopic vergence control and horizontal tracking using biologically inspired filters | en_US |
dc.type | text | en_US |
dc.type | Thesis-Reproduction (electronic) | en_US |
thesis.degree.grantor | University of Arizona | en_US |
thesis.degree.level | masters | en_US |
dc.identifier.proquest | 1402029 | en_US |
thesis.degree.discipline | Graduate College | en_US |
thesis.degree.discipline | Electrical and Computer Engineering | en_US |
thesis.degree.name | M.S. | en_US |
dc.identifier.bibrecord | .b41166152 | en_US |
refterms.dateFOA | 2018-09-04T05:36:51Z | |
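The abstract above describes a disparity-energy model, derived from studies of disparity tuning in cat visual cortex, that drives vergence (disconjugate) and tracking (conjugate) camera rotations. The Python sketch below only illustrates that class of computation; it is not the thesis' software, custom-hardware, or analog VLSI implementation. The Gabor filter parameters, the gabor_pair helper, the candidate disparity range, and the gains k_vergence and k_tracking are assumptions introduced here for illustration.

    # Illustrative sketch of a phase-based disparity-energy model driving
    # vergence and tracking commands; parameters are assumed, not from the thesis.
    import numpy as np

    def gabor_pair(size=33, wavelength=8.0, sigma=4.0, phase=0.0):
        """Quadrature (even/odd) 1-D Gabor filters with an extra phase offset."""
        x = np.arange(size) - size // 2
        envelope = np.exp(-x**2 / (2.0 * sigma**2))
        even = envelope * np.cos(2.0 * np.pi * x / wavelength + phase)
        odd = envelope * np.sin(2.0 * np.pi * x / wavelength + phase)
        return even, odd

    def disparity_energy(left_row, right_row, candidate_disparities, wavelength=8.0):
        """Binocular energy responses for a bank of phase-shifted units."""
        responses = []
        for d in candidate_disparities:
            phase_shift = 2.0 * np.pi * d / wavelength     # disparity encoded as phase
            le, lo = gabor_pair(wavelength=wavelength, phase=0.0)
            re, ro = gabor_pair(wavelength=wavelength, phase=phase_shift)
            # Binocular simple cells sum left and right monocular responses;
            # the complex (energy) stage squares and sums the quadrature pair.
            s_even = np.convolve(left_row, le, mode='valid') + np.convolve(right_row, re, mode='valid')
            s_odd = np.convolve(left_row, lo, mode='valid') + np.convolve(right_row, ro, mode='valid')
            responses.append(np.mean(s_even**2 + s_odd**2))
        return np.array(responses)

    def control_commands(left_row, right_row, k_vergence=0.1, k_tracking=0.05):
        """Map the peak of the disparity-energy population to camera commands."""
        candidates = np.arange(-4, 5)                      # candidate disparities in pixels
        energy = disparity_energy(left_row, right_row, candidates)
        disparity = candidates[int(np.argmax(energy))]
        vergence = k_vergence * disparity                  # disconjugate: cameras rotate oppositely
        # Conjugate tracking from the target's mean (cyclopean) horizontal offset,
        # crudely approximated here by the brightest column in each image.
        centre = left_row.size / 2.0
        offset = 0.5 * (np.argmax(left_row) + np.argmax(right_row)) - centre
        tracking = k_tracking * offset                     # conjugate: both cameras rotate together
        return vergence, tracking

In this sketch the unit tuned to the stimulus disparity responds most strongly, so the population peak serves as the disparity estimate; the thesis additionally addresses hardware realization and scaling, which this sketch does not attempt.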