Computational Model of Stereoscopic 3D Visual Saliency

  • J. Wang
  • M. P. Da Silva
  • P. Le Callet
  • V. Ricordel

Many computational models of visual attention that perform well in predicting salient areas of 2D images have been proposed in the literature. Emerging stereoscopic 3D display applications bring additional depth information that affects human viewing behavior, and they require extending the efforts made in 2D visual modeling. In this paper, we propose a new computational model of visual attention for stereoscopic 3D still images. Apart from detecting salient areas based on 2D visual features, the proposed model takes depth as an additional visual dimension. The measure of depth saliency is derived from eye-movement data obtained in an eye-tracking experiment using synthetic stimuli. Two different ways of integrating depth information into the modeling of 3D visual attention are then proposed and examined. For the performance evaluation of 3D visual attention models, we have created an eye-tracking database containing stereoscopic images of natural content, which is made publicly available along with this paper. The proposed model performs well compared with state-of-the-art 2D models on 2D images. The results also suggest that better performance is obtained when depth information is taken into account through the creation of a depth saliency map rather than integrated by a weighting method.
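The two integration schemes compared in the abstract (fusing a separate depth saliency map with the 2D saliency map, versus weighting the 2D map directly by depth) can be sketched in a few lines of NumPy. The function names, the linear-fusion form, and the depth normalization below are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

def fuse_saliency_maps(s2d, s_depth, w=0.5):
    """Scheme 1 (sketch): a separate depth saliency map is built and
    fused with the 2D saliency map, here by a linear combination.
    The weight w and the linear form are assumptions for illustration."""
    return w * s2d + (1.0 - w) * s_depth

def depth_weighting(s2d, depth):
    """Scheme 2 (sketch): no separate depth saliency map; the 2D
    saliency map is modulated pixel-wise by normalized depth.
    The normalization and the sign convention (larger normalized
    depth = more weight) are illustrative assumptions."""
    d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
    return s2d * d
```

In scheme 1 the depth channel contributes its own saliency estimate before fusion, whereas in scheme 2 depth only rescales saliency that the 2D features have already detected; the abstract reports that the first approach performed better.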

Author-supplied keywords

  • 3d-tv
  • attention
  • bottom-up
  • depth
  • depth saliency
  • eye-tracking
  • objects
  • saliency map
  • search
  • stereoscopy
  • top-down
  • video
  • visual attention


