Eye-Adapted HDR Viewing of Stereoscopic Panoramic Photography in Virtual Reality

A. Regalbuto, S. Livatino

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Producing truly realistic Virtual Reality (VR) experiences is a process often hindered by the constraints and limited capabilities of currently available consumer-grade VR headsets. These limitations include the lack of dynamic range, reduced color gamut, and lower bit-depth of Standard Dynamic Range (SDR) VR displays, which can result in a degraded immersive experience and sense of presence. To overcome hardware limitations, various techniques have been developed for observing High Dynamic Range (HDR) imagery on SDR displays, aimed at simulating the human eye's adaptation to varying lighting conditions. These techniques include tone-mapping and exposure adjustments. This paper proposes a graphics engine-based solution for eye-tracked VR headsets, where panoramic stereoscopic images are captured with a consumer omnidirectional camera system. Our method leverages the headset's capability to collect the user's head direction and eye gaze during observations. Three approaches are evaluated, specifically targeting stereoscopic fully spherical panoramic photography within VR headsets: Static HDR, which displays tone-mapped images derived from HDR panoramas; Eye-adapted HDR with head tracking, which adjusts the luminance range shown on the SDR display based on the user's head rotation; and Eye-adapted HDR with eye tracking, which dynamically changes the luminance range shown on the SDR display based on the user's gaze position on the screen. Both indoor and outdoor scenes are assessed. The outcomes from our usability evaluation provide valuable insights into how eye-adapted HDR techniques can enhance the sense of presence by better simulating the human eye's adaptation to different lighting conditions. Our assessment includes effects on emotions, visual realism, and depth perception, showing clear trends and directions, particularly when using eye-tracking and real-time eye-adaptation in outdoor environments.
These are desired features when designing immersive photography-...
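The eye-adapted approaches described in the abstract can be illustrated with a minimal sketch: derive a luminance estimate from the image region the user is looking at, smooth it over time to mimic the eye's gradual adaptation, and use it to drive exposure before a tone-mapping operator compresses the result for the SDR display. The sketch below is an illustration only, not the paper's actual pipeline: the class and function names are hypothetical, the adaptation model (exponential approach to the gazed luminance) and the Reinhard-style operator are common stand-ins assumed here for clarity.

```python
import numpy as np

def local_luminance(hdr, gaze_uv, radius=32):
    """Mean luminance of an HDR image patch around the gaze point (u, v in pixels)."""
    u, v = gaze_uv
    h, w, _ = hdr.shape
    patch = hdr[max(0, v - radius):min(h, v + radius),
                max(0, u - radius):min(w, u + radius)]
    # Rec. 709 luma weights for linear RGB
    return float(np.mean(patch @ np.array([0.2126, 0.7152, 0.0722])))

class EyeAdaptedToneMapper:
    """Temporally smoothed exposure driven by gaze luminance (hypothetical sketch)."""
    def __init__(self, adaptation_rate=2.0, key=0.18):
        self.adapted_lum = None
        self.rate = adaptation_rate   # 1/s: speed of the simulated eye adaptation
        self.key = key                # target middle grey after exposure scaling

    def tonemap(self, hdr, gaze_uv, dt):
        """Return an SDR frame in [0, 1) for one rendered frame of duration dt."""
        target = max(local_luminance(hdr, gaze_uv), 1e-4)
        if self.adapted_lum is None:
            self.adapted_lum = target
        # Exponential approach to the gazed luminance, mimicking eye adaptation
        alpha = 1.0 - np.exp(-self.rate * dt)
        self.adapted_lum += (target - self.adapted_lum) * alpha
        exposed = hdr * (self.key / self.adapted_lum)
        # Simple Reinhard operator compresses [0, inf) into [0, 1) for SDR output
        return exposed / (1.0 + exposed)
```

For the head-tracking variant, the gaze point would simply be replaced by the screen centre of the current head direction; the static variant would fix the adaptation luminance to a single precomputed value per panorama.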
Original language: English
Title of host publication: 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE)
Place of publication: St Albans, UK
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 924-929
Number of pages: 6
ISBN (Electronic): 979-8-3503-7800-9
ISBN (Print): 979-8-3503-7799-6
Publication status: Published - 24 Dec 2024
Event: 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering - IEEE MetroXRAINE - The Alban Arena, St Albans, United Kingdom
Duration: 21 Oct 2024 – 23 Oct 2024
https://metroxraine.org/index

Conference

Conference: 2024 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence and Neural Engineering - IEEE MetroXRAINE
Abbreviated title: IEEE MetroXRAINE 2024
Country/Territory: United Kingdom
City: St Albans
Period: 21/10/24 – 23/10/24
Internet address: https://metroxraine.org/index
