Gloss discrimination: Towards an image-based perceptual model

Abstract

Gloss is typically considered the perceptual counterpart of a surface’s specular reflectance characteristics, much as color is the perceptual counterpart of a surface’s diffuse reflectance spectrum. In many contexts, it is tempting to ask how discriminable two surfaces are on the basis of their reflectance properties. Yet, as we argue here, this is a poorly posed question, as factors other than reflectance (e.g., lighting, shape, viewpoint) can have substantial effects on how discriminable two images of glossy surfaces are to human participants. This fundamental difficulty with predicting gloss discrimination, whether from a physical measurement or from proximal image data, has so far hobbled efforts to establish a rigorously defined perceptual standard for surface gloss, similar to those that exist for color. Here, we propose an experimental framework for making this problem tractable, starting from the premise that any perceptual standard of gloss discrimination must account for how distal scene variables influence the statistics of proximal image data. With this goal in mind, we rendered a large set of images in which shape, illumination, viewpoint, and surface roughness were varied. For each combination of viewing conditions, a fixed difference in surface roughness was used to create a pair of images showing the same object (from the same viewpoint and under the same lighting) with high and low gloss. Human participants (N=150) completed a paired-comparisons task in which they selected which of two image pairs showed the larger apparent gloss difference. Importantly, rankings of the scenes derived from these judgments represent differences in perceived gloss independent of physical reflectance. We find that these rankings are remarkably consistent across participants and are well predicted by a straightforward Visual Differences Predictor (Daly, 1992; Mantiuk et al., 2023). This allows us to estimate reasonable bounds on visual discriminability for a given surface across a wide range of viewing conditions, with potential applications in vision science, computer graphics, and industrial contexts.
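The scene rankings described in the abstract are derived from paired-comparison choices. As a minimal sketch of how such rankings can be recovered (this is not the authors’ analysis code, and the win-count matrix below is hypothetical), the following Python snippet applies Thurstone Case V scaling to a count matrix in which wins[i, j] records how often the image pair for scene i was judged to show a larger apparent gloss difference than the pair for scene j:

```python
# Minimal sketch of Thurstone Case V scaling for paired-comparison data.
# Not the authors' analysis code; the count matrix below is hypothetical.
import numpy as np
from scipy.stats import norm

def thurstone_case_v(wins: np.ndarray, eps: float = 0.5) -> np.ndarray:
    """Scale values from a paired-comparison win-count matrix.

    wins[i, j] = number of trials on which scene i was chosen over scene j as
    showing the larger apparent gloss difference.
    """
    n = wins + wins.T                      # total comparisons for each pair
    p = (wins + eps) / (n + 2.0 * eps)     # choice proportions, lightly regularised
    np.fill_diagonal(p, 0.5)               # no self-comparisons; z = 0 on the diagonal
    z = norm.ppf(p)                        # inverse-normal transform (Case V assumption)
    scores = z.mean(axis=1)                # scale value = row mean of z-scores
    return scores - scores.mean()          # anchor the scale at zero mean

# Hypothetical counts for three scenes (different shape/lighting/viewpoint combinations):
wins = np.array([[ 0, 18, 25],
                 [12,  0, 20],
                 [ 5, 10,  0]], dtype=float)
print(thurstone_case_v(wins))              # higher score = more visible gloss difference
```

The resulting per-scene scores form an interval scale of apparent gloss difference, which could then be compared against the output of an image-based metric such as a Visual Differences Predictor.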
