Co-axial depth sensor with an extended depth range for AR/VR applications
Name:
1-s2.0-S2096579620300012-main.pdf
Size:
4.377Mb
Format:
PDF
Description:
Final Published Version
Affiliation
Visualization and Imaging Systems Laboratory, College of Optical Sciences, University of Arizona
Issue Date
2020
Publisher
KeAi Communications Co.
Citation
Xu, M., & Hua, H. (2020). Co-axial depth sensor with an extended depth range for AR/VR applications. Virtual Reality and Intelligent Hardware, 2(1), 1–11.
Rights
Copyright © Beijing Zhongke Journal Publishing Co. Ltd. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
Collection Information
This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
Abstract
Background: A depth sensor is an essential element of virtual and augmented reality devices, digitizing the user's environment in real time. Popular current technologies include stereo, structured light, and Time-of-Flight (ToF) sensing. Stereo and structured-light methods require a baseline separation between multiple sensors, and both suffer from a limited measurement range. ToF depth sensors offer the largest depth range but the lowest depth-map resolution. To overcome these problems, we propose a co-axial depth-map sensor that is potentially more compact and cost-effective than conventional structured-light depth cameras. It extends the depth range while maintaining a high depth-map resolution, and it provides a high-resolution 2D image along with the 3D depth map.
Methods: The depth sensor is constructed with a projection path and an imaging path, combined by a beamsplitter for a co-axial design. In the projection path, a cylindrical lens adds extra optical power in one direction, creating an astigmatic pattern. For depth measurement, the astigmatic pattern is projected onto the test scene, and the depth information is calculated from the contrast change of the reflected pattern image in two orthogonal directions. To extend the measurement range, an electronically focus-tunable lens is placed at the system stop and its power is tuned, extending the depth range without compromising depth resolution.
Results: In the depth-measurement simulation, we project a resolution target onto a white screen moving along the optical axis and tune the focus-tunable lens power to define three depth-measurement subranges: near, middle, and far. In each subrange, as the test screen moves away from the depth sensor, the horizontal contrast of the reflected image increases while the vertical contrast decreases.
Depth can therefore be obtained by computing the contrast ratio between features in the two orthogonal directions.
Conclusions: The proposed depth-map sensor implements depth measurement over an extended depth range with a co-axial design. © 2019 Beijing Zhongke Journal Publishing Co. Ltd
Note
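The contrast-ratio principle described in the abstract can be sketched numerically. The sketch below is a toy model, not the authors' implementation: it assumes a simplified contrast falloff with defocus, and the focal depths `z_h`, `z_v` and the `falloff` parameter are hypothetical. Because the cylindrical lens separates the horizontal-feature and vertical-feature focal planes, the ratio of the two contrasts varies monotonically with depth between the two foci and can be inverted to a depth estimate.

```python
import numpy as np

# Toy astigmatic-contrast model (hypothetical parameters, NOT the paper's
# optical model). The cylindrical lens places the horizontal-feature focus
# at z_h (far) and the vertical-feature focus at z_v (near), so as the
# screen moves away, horizontal contrast rises while vertical contrast
# falls, matching the trend reported in the abstract.

def contrast(z, z_focus, falloff=0.5):
    """Contrast of pattern features: peaks at the focal plane, decays with defocus."""
    return 1.0 / (1.0 + (falloff * (z - z_focus)) ** 2)

def depth_from_ratio(ratio, z_h, z_v, falloff=0.5):
    """Invert the horizontal/vertical contrast ratio to depth via a lookup table.

    Between the two focal planes the ratio is monotonic in depth, so a
    nearest-neighbor lookup on a dense grid recovers the depth.
    """
    zs = np.linspace(min(z_h, z_v), max(z_h, z_v), 2001)
    ratios = contrast(zs, z_h, falloff) / contrast(zs, z_v, falloff)
    return zs[np.argmin(np.abs(ratios - ratio))]

# Hypothetical focal depths (meters) for one subrange; in the paper the
# focus-tunable lens would shift this working window to cover the
# near, middle, and far subranges in turn.
z_v, z_h = 1.0, 2.0

z_true = 1.4                                          # simulated screen depth
r = contrast(z_true, z_h) / contrast(z_true, z_v)     # measured contrast ratio
z_est = depth_from_ratio(r, z_h, z_v)                 # recovered depth
print(z_est)
```

In this single-subrange sketch, `z_est` recovers `z_true` to within the lookup-grid spacing; the paper's extended range would come from retuning the focus-tunable lens power so the monotonic window of the contrast ratio sweeps across near, middle, and far subranges.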
Open access journal
ISSN
2096-5796
Version
Final published version
DOI
10.1016/j.vrih.2019.10.004