Saliency & Visual Search in VR
This VR eye-tracking study tested whether a salient target glimpsed outside the initial field of view (FoV) speeds visual search once it enters view. Participants searched for targets under three conditions: the target was visible from the start, a head turn was required to bring it into view, or the periphery was revealed only after the head turn. Salient items were found faster only when they already fell within the initial FoV; when a head movement was required, search proceeded as if no peripheral cue were available, and time gains arose only once the target set came into view. The takeaway: classic saliency effects from fixed displays do not fully generalise to active, head-driven search in VR, a result relevant to interface layout and task design.