An immersive experience is typically connected to user-scene interaction: the system must sense the user's movements and inputs and trigger actions accordingly. In PC and console games, interaction typically occurs through hand controllers equipped with joysticks. Compared to a mouse and keyboard, joysticks allow for articulated command actions, letting users move both objects and viewpoints. When wearing a VR headset, the user's head orientation is tracked, and head rotation triggers viewpoint changes; this is a major difference between VR headsets and desktop monitors. Changing the observation viewpoint through head rotation feels natural and reinforces the sense of isolation from the surrounding world that a VR headset already provides; both effects contribute to increased user immersion in the observed world. However, when overviewing a relatively complex dynamic scene, changing the viewpoint by head rotation may not be the most suitable option; a change triggered by the controller's joystick may prove more convenient and comfortable. This paper assesses the use of head rotation and the controller joystick for changing the observation viewpoint. Our application context is three-dimensional dynamic-scene viewing within command and control operations, as is typical in military defense. Our assessment includes tasks of identifying, discovering, and positioning drone movements. We measure objective user performance through mission success and completion time, as well as subjective factors such as ease of use, presence, comfort, and depth perception.