Abstract
Producing truly realistic Virtual Reality (VR) experiences is often hindered by the constraints of currently available consumer-grade VR headsets. These limitations include the narrow dynamic range, reduced color gamut, and lower bit depth of Standard Dynamic Range (SDR) VR displays, which can degrade immersion and the sense of presence. To overcome these hardware limitations, various techniques, including tone mapping and exposure adjustment, have been developed to display High Dynamic Range (HDR) imagery on SDR displays by simulating the human eye's adaptation to varying lighting conditions. This paper proposes a graphics-engine-based solution for eye-tracked VR headsets, in which panoramic stereoscopic images are captured with a consumer omnidirectional camera system. Our method leverages the headset's ability to record the user's head direction and eye gaze during observation. Three approaches targeting stereoscopic, fully spherical panoramic photography in VR headsets are evaluated: Static HDR, which displays tone-mapped images derived from HDR panoramas; Eye-adapted HDR with head tracking, which adjusts the luminance range shown on the SDR display according to the user's head rotation; and Eye-adapted HDR with eye tracking, which dynamically adjusts the displayed luminance range according to the user's gaze position on the screen. Both indoor and outdoor scenes are assessed. The outcomes of our usability evaluation provide valuable insights into how eye-adapted HDR techniques can enhance the sense of presence by better simulating the human eye's adaptation to different lighting conditions. Our assessment covers effects on emotions, visual realism, and depth perception, showing clear trends and directions, particularly when eye tracking and real-time eye adaptation are used in outdoor environments.
These are desired features when designing immersive photography-...
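The paper itself does not publish its implementation, but the eye-adapted HDR idea described in the abstract (estimating the adaptation luminance around the gaze point and remapping the HDR panorama into the SDR range accordingly) can be sketched as follows. This is a minimal illustration, not the authors' method: the function names, the patch-based log-average adaptation estimate, and the Reinhard-style compression are all assumptions chosen for clarity.

```python
import numpy as np

def gaze_adapted_exposure(hdr_image, gaze_uv, patch_radius=32, key=0.18):
    """Estimate an exposure scale from the luminance around the gaze point.

    hdr_image: float array (H, W, 3) of linear radiance values.
    gaze_uv:   gaze position in normalized [0, 1] screen coordinates.
    """
    h, w, _ = hdr_image.shape
    cx, cy = int(gaze_uv[0] * (w - 1)), int(gaze_uv[1] * (h - 1))
    x0, x1 = max(cx - patch_radius, 0), min(cx + patch_radius + 1, w)
    y0, y1 = max(cy - patch_radius, 0), min(cy + patch_radius + 1, h)
    patch = hdr_image[y0:y1, x0:x1]
    # Relative luminance (Rec. 709 weights), floored to avoid log(0).
    lum = np.maximum(patch @ np.array([0.2126, 0.7152, 0.0722]), 1e-6)
    # Log-average luminance as a stand-in for the eye's local adaptation level.
    adapt = np.exp(np.mean(np.log(lum)))
    return key / adapt

def tone_map(hdr_image, exposure):
    """Simple Reinhard operator: scale, then compress into [0, 1] for SDR."""
    scaled = hdr_image * exposure
    return scaled / (1.0 + scaled)
```

In a real-time engine, the exposure value would additionally be low-pass filtered over time to mimic the latency of human light adaptation and to avoid flicker as the gaze moves between bright and dark regions.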
Original language | English |
---|---|
Title of host publication | 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE) |
Place of Publication | St Albans, UK |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Pages | 924-929 |
Number of pages | 6 |
ISBN (Electronic) | 979-8-3503-7800-9 |
ISBN (Print) | 979-8-3503-7799-6 |
DOIs | |
Publication status | Published - 24 Dec 2024 |
Event | 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering (IEEE MetroXRAINE), The Alban Arena, St Albans, United Kingdom, 21 Oct 2024 → 23 Oct 2024, https://metroxraine.org/index |
Conference
Conference | 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering - IEEE MetroXRAINE |
---|---|
Abbreviated title | IEEE MetroXRAINE 2024 |
Country/Territory | United Kingdom |
City | St Albans |
Period | 21/10/24 → 23/10/24 |
Internet address | https://metroxraine.org/index |