From visual appearance to material categories
Abstract
Materials from distinct categories share common image features but also exhibit unique characteristics. How do humans recognize materials despite their enormous appearance variations? Using generative neural networks, we create both prototypical and ambiguous materials that morph between categories. To capture the richness of material representation, we characterize how people judge cross-material morphs through three behavioral tasks: material categorization, material property rating, and visual similarity judgment. We find that morphing smoothly modulates perceptual scales of material appearance. Despite this smooth variation, participants can reliably identify the prototypical materials within given appearance ranges, indicating a strong association between material categories and their visual characteristics. Material properties structure the perceptual space, where the salient dimensions, particularly rigidity, strongly correlate with perceived material categories. In contrast, analyzing image embeddings derived from a self-supervised deep learning model reveals that learning visual similarities alone is insufficient to reproduce the human perceptual space of materials. Together, our results suggest that material representation goes beyond visual similarity and may require learning material properties that structure relationships among materials within and across categories. Visual features might be used flexibly and task-dependently to support multiple levels of material perception, ranging from discrimination to material property inference to categorization.