A multiresolution color model for visual difference prediction

  • David J. Tolhurst
  • Caterina Ripamonti
  • C. Alejandro Párraga
  • P. George Lovell
  • Tom Troscianko


How different are two images when viewed by a human observer? Such knowledge is needed in many situations, including when one has to judge how closely a graphics rendering matches a high-quality photograph of the original scene. A class of computational models attempts to predict such perceived differences. These models are derived from theoretical considerations of human vision and are mostly validated by experiments on simple stimuli such as sinusoidal gratings. We are developing a model of visual difference prediction based on multi-scale analysis of local contrast, to be tested with psychophysical discrimination experiments on natural-scene stimuli. Here, we extend our model to account for differences in the chromatic domain. We describe the model, how it was derived, and how we attempt to validate it psychophysically for monochrome and chromatic images. Copyright © 2005 by the Association for Computing Machinery, Inc.
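The multi-scale analysis of local contrast described in the abstract can be illustrated with a minimal sketch: decompose each image into spatial-frequency bands with a Laplacian-style pyramid, compare the bands, and pool the per-band errors into a single difference score. This is an assumption-laden toy, not the authors' published model — the function names, the choice of `p = 2` Minkowski pooling, and the plain difference-of-Gaussians decomposition are all illustrative stand-ins (VDP-style models typically add contrast-sensitivity weighting and masking, which are omitted here).

```python
import numpy as np

def gaussian_blur(img, sigma=1.0):
    # Separable Gaussian blur: 1-D convolution along each axis in turn.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 0, img)
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, blurred)

def laplacian_pyramid(img, levels=4):
    # Band-pass decomposition: each level is the residual between an image
    # and its blurred copy (a difference-of-Gaussians), one octave apart.
    pyramid, current = [], img.astype(float)
    for _ in range(levels):
        low = gaussian_blur(current, sigma=1.0)
        pyramid.append(current - low)   # band-pass residual at this scale
        current = low[::2, ::2]         # downsample for the next octave
    pyramid.append(current)             # low-pass remainder
    return pyramid

def visual_difference(img_a, img_b, levels=4, p=2.0):
    # Pool per-band mean errors with a Minkowski sum; larger p would
    # emphasise the worst-matching band more strongly.
    pa = laplacian_pyramid(img_a, levels)
    pb = laplacian_pyramid(img_b, levels)
    band_errors = [np.mean(np.abs(a - b) ** p) for a, b in zip(pa, pb)]
    return sum(band_errors) ** (1.0 / p)
```

Identical images yield a score of zero, and localized edits raise the score; extending such a scheme to the chromatic domain would mean running the decomposition on color-opponent channels rather than on luminance alone.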

Author-supplied keywords

  • color vision
  • image difference
  • metrics
  • psychophysical testing

