Digital Phase Conjugation and Real-Time 3D Object Generation with MEMS Phase Light Modulator
Publisher
The University of Arizona.
Rights
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract
An ideal near-to-eye display (NED) requires high-resolution images, a large field of view (FOV), and depth cues; in practice, these are degraded by the optical aberrations of the display optics. In the first project, to correct for such aberrations, digital phase conjugation (DPC) was demonstrated with a Texas Instruments phase light modulator (TI-PLM) for the first time, generating a tightly focused point image through an aberrated TIR/geometrical image-guide combiner. The aberration of the image guide was measured by off-axis holography, in which the off-axis fringes are captured on a camera sensor. The captured fringes were then processed, via a Fourier transform and cropping of the +1st diffraction order, to extract the complex field while suppressing low-frequency noise (sketched below). A computer-generated hologram (CGH) negating the measured phase aberration was computed and displayed on the PLM (also sketched below). Through phase conjugation, a spherical, unaberrated wavefront was reconstructed through the aberrated optical medium, producing a series of point sources at different depths, i.e., a 3D point image. This method can generate multiple point sources at different depths, contributing to the 3D image in a near-to-eye display even through an aberrated medium.

In the second part of the thesis, a real-time pipeline toward a holographic 3D head-up display (HUD) for Advanced Driver Assistance Systems (ADAS) was pursued. A holographic 3D HUD provides an integrated environment for the driver by sensing the situation around the vehicle, extracting significant and critical information from the sensor data, such as traffic signs, preceding vehicles, and even occluded obstacles, and notifying the driver. Three primary challenges were addressed to realize this pipeline. First, road-lane recognition was performed on the sensor input using OpenCV. Second, to keep display latency low, the CGH was computed on a GPU with CUDA. Third, based on the detected road lanes, the optical overlay of the 3D image on the road lane was achieved with a first-order optical model; each of these three steps is sketched below. This approach allows the 3D image to be displayed over the road lane at matching size and depth, giving the driver direction guidance based on navigation data. The speed of the CGH calculation was compared across three platforms: CPU-based, GPU-based, and cloud-based computing.

The development of a 3D holographic display using the TI-PLM addresses significant challenges in NED and ADAS applications. By leveraging DPC and CGH, the thesis demonstrates the ability to correct optical aberrations and produce high-resolution, depth-cued 3D images. The pipeline developed for the holographic HUD integrates real-time sensor data, ensuring low-latency display and accurate overlay on the road lane. Through GPU-based processing, the project achieves efficient CGH generation, overcoming the computational cost associated with complex 3D objects. Applied in the automotive industry, this technology has the potential to enhance driver safety and navigation by providing immediate and intuitive visual information. Overall, the integration of real-time CGH with HUD systems promises to transform how critical information is conveyed, leading to safer and more efficient operations.
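A minimal NumPy sketch of the off-axis field extraction described above. The carrier location, crop radius, and all names are illustrative assumptions, not the thesis code:

```python
import numpy as np

def extract_field(fringes, carrier, radius):
    """Recover the complex object field from an off-axis hologram.

    fringes : 2-D intensity image of the off-axis interference pattern.
    carrier : (row, col) of the +1st-order peak in the FFT (found beforehand).
    radius  : half-size of the square window cropped around the +1st order.
    """
    # 2-D Fourier transform of the fringes, DC term shifted to the center.
    F = np.fft.fftshift(np.fft.fft2(fringes))

    # Crop a window around the +1st order; this rejects the low-frequency
    # DC term and the conjugate -1st order.
    r, c = carrier
    window = F[r - radius:r + radius, c - radius:c + radius]

    # Re-centering the crop removes the carrier frequency; the inverse
    # transform then yields the complex field (amplitude and phase).
    return np.fft.ifft2(np.fft.ifftshift(window))

# Usage (hypothetical frame and carrier position):
# field = extract_field(camera_frame.astype(float), carrier=(620, 910), radius=128)
# aberration_phase = np.angle(field)
```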
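The phase-conjugate CGH step can likewise be sketched as follows; the wavelength, pixel pitch, target depth, and 4-bit quantization are assumptions for illustration rather than the TI-PLM's documented parameters:

```python
import numpy as np

def conjugate_cgh(aberration_phase, wavelength, pitch, depth, levels=16):
    """Phase pattern that cancels the measured aberration and focuses a point
    at distance `depth` from the modulator; `levels` models an assumed
    4-bit phase modulator."""
    ny, nx = aberration_phase.shape
    y, x = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
    # Paraxial spherical phase converging at the target depth.
    lens = -np.pi / (wavelength * depth) * ((x * pitch) ** 2 + (y * pitch) ** 2)
    # Subtracting the measured phase implements digital phase conjugation.
    phase = np.mod(lens - aberration_phase, 2 * np.pi)
    # Quantize to the modulator's discrete phase levels.
    return (np.round(phase / (2 * np.pi) * levels) % levels).astype(np.uint8)

# Repeating this for several depths yields the series of 3D point sources,
# e.g. (with illustrative wavelength and pitch):
# patterns = [conjugate_cgh(phi, 520e-9, 10.8e-6, d) for d in (0.3, 0.5, 1.0)]
```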
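For the lane-recognition step, one classical OpenCV approach consistent with the description above is sketched here; the edge thresholds, region of interest, and Hough parameters are illustrative:

```python
import cv2
import numpy as np

def detect_lanes(frame):
    """Classical lane detection: grayscale -> Canny edges -> road-region
    mask -> probabilistic Hough transform. All thresholds are illustrative."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

    # Keep only a trapezoidal region in front of the vehicle.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    poly = np.array([[(0, h), (w // 2 - 60, int(0.6 * h)),
                      (w // 2 + 60, int(0.6 * h)), (w, h)]], dtype=np.int32)
    cv2.fillPoly(roi, poly, 255)
    masked = cv2.bitwise_and(edges, roi)

    # Returns line segments (x1, y1, x2, y2) along the lane markings.
    return cv2.HoughLinesP(masked, 1, np.pi / 180, threshold=40,
                           minLineLength=40, maxLineGap=100)
```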
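The thesis computes the CGH on a GPU in CUDA mode; as one hedged illustration, a point-cloud CGH (a sum of spherical waves) can be written with CuPy, which executes as CUDA kernels. The point format, array sizes, and the point-source model are assumptions:

```python
import numpy as np
import cupy as cp

def point_cloud_cgh(points, wavelength, pitch, ny, nx):
    """Phase-only CGH of a 3-D point cloud. Each row of `points` is
    (x, y, z, amplitude) in meters; the spherical wave from every point is
    accumulated on the GPU and the phase of the total field is returned."""
    y, x = cp.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
    xh, yh = x * pitch, y * pitch            # hologram-plane coordinates
    k = 2.0 * np.pi / wavelength

    field = cp.zeros((ny, nx), dtype=cp.complex64)
    for px, py, pz, amp in points:           # points stays on the host (NumPy)
        r = cp.sqrt((xh - px) ** 2 + (yh - py) ** 2 + pz ** 2)
        field += amp * cp.exp(1j * k * r) / r
    return cp.asnumpy(cp.angle(field))       # wrapped phase for the PLM
```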
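Finally, a minimal sketch of a first-order overlay model, assuming the HUD relay is reduced to a single thin lens of focal length f (sign convention: 1/s_i − 1/s_o = 1/f, with distances on the incoming side negative); the numbers are illustrative:

```python
def overlay_geometry(f, lane_distance):
    """First-order (thin-lens) overlay: place the hologram's point source so
    its virtual image lands on the detected lane feature, and return the
    lateral magnification used to match the symbol's size to the lane."""
    s_i = -lane_distance                   # virtual image at the lane distance
    s_o = 1.0 / (1.0 / s_i - 1.0 / f)      # thin-lens equation
    m = s_i / s_o                          # lateral magnification
    return s_o, m

# e.g. with an assumed f = 0.25 m relay and a lane feature 10 m ahead:
# s_o, m = overlay_geometry(0.25, 10.0)    # s_o ≈ -0.244 m, m ≈ 41
# A symbol rendered ~24.4 cm from the relay appears at 10 m, magnified ~41x.
```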
Type
Electronic Thesis
text
Degree Name
M.S.
Degree Level
masters
Degree Program
Graduate College
Optical Sciences