Retinal motion statistics during natural locomotion

Curation statements for this article:
  • Curated by eLife

    eLife assessment

    This important study should be of interest to vision scientists and those seeking to model naturalistic image processing for humans in simulated or real navigational [walking] situations. The experiments aim to provide information about the statistics of "retinal" motion patterns generated by human participants physically walking a straight path in real terrains that differ in "smoothness". State-of-the-art eye, head, and body tracking allowed simultaneous assessment of eye movements, head movements, and gait, with convincing evidence for an asymmetrical gradient of flow speeds during walking, tied predominantly to vertical gaze angle, together with a radial motion direction distribution tied most critically to horizontal gaze angle. While not a major weakness per se, additional details on analytical methods used and estimations of variance across observers would strengthen these results and clarify the basis of the global claims made about visual motion information across the visual field in walking humans.

Abstract

Walking through an environment generates retinal motion, which humans rely on to perform a variety of visual tasks. Retinal motion patterns are determined by an interconnected set of factors, including gaze location, gaze stabilization, the structure of the environment, and the walker’s goals. The characteristics of these motion signals have important consequences for neural organization and behavior. However, to date, there are no empirical in situ measurements of how combined eye and body movements interact with real 3D environments to shape the statistics of retinal motion signals. Here, we collect measurements of the eyes, the body, and the 3D environment during locomotion. We describe properties of the resulting retinal motion patterns. We explain how these patterns are shaped by gaze location in the world, as well as by behavior, and how they may provide a template for the way motion sensitivity and receptive field properties vary across the visual field.

Article activity feed

  1. Author Response:

    What is novel here is that we calculated the time-varying retinal motion patterns generated during the gait cycle using a 3D reconstruction of the terrain. This allows calculation of the actual statistics of retinal motion experienced by walkers over a broad range of normal experience. We certainly do not mean to claim that stabilizing gaze is novel, and agree that the general patterns follow directly from the geometry as worked out very elegantly by Koenderink and others. We spend time describing the terrain-linked gaze behavior because it is essential for understanding the paper. We do not claim that the basic saccade/stabilize/saccade behavior is novel and now make this clearer.
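The geometry behind this calculation can be illustrated in a few lines. The following toy sketch (not the authors' pipeline; the grid terrain, walking speed, and function names are illustrative assumptions) computes retinal motion by projecting 3D ground points into an eye-centered frame at two instants while the eye translates forward and counter-rotates to keep one ground point fixated:

```python
import numpy as np

def eye_frame(eye, fixation, world_up=np.array([0.0, 0.0, 1.0])):
    # Orthonormal gaze frame: the forward axis points at the fixated location.
    fwd = fixation - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, world_up)
    right = right / np.linalg.norm(right)
    up = np.cross(right, fwd)
    return fwd, right, up

def retinal_angles(points, eye, fixation):
    # Azimuth/elevation (deg) of world points in the eye-centered frame.
    fwd, right, up = eye_frame(eye, fixation)
    d = points - eye
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    az = np.degrees(np.arctan2(d @ right, d @ fwd))
    el = np.degrees(np.arctan2(d @ up, d @ fwd))
    return np.stack([az, el], axis=1)

# Toy "terrain": a grid of ground-plane points ahead of the walker,
# with the fixated ground point itself appended as the last row.
gx, gy = np.meshgrid(np.linspace(-2, 2, 9), np.linspace(1, 6, 9))
fixation = np.array([0.0, 3.0, 0.0])
terrain = np.vstack([np.stack([gx.ravel(), gy.ravel(),
                               np.zeros(gx.size)], axis=1), fixation])

dt = 0.01                                      # time step (s)
eye0 = np.array([0.0, 0.0, 1.6])               # eye ~1.6 m above ground
eye1 = eye0 + np.array([0.0, 1.4, 0.0]) * dt   # translating at ~1.4 m/s

# Retinal motion (deg/s): change in eye-centered direction per unit time,
# with the eye counter-rotating to hold the fixation point at (0, 0).
flow = (retinal_angles(terrain, eye1, fixation)
        - retinal_angles(terrain, eye0, fixation)) / dt
```

Under stabilization the fixated point has essentially zero retinal velocity, while speeds grow with eccentricity and are larger for nearer (lower-field) ground points, qualitatively reproducing the speed gradient described in the paper.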

    The other novel aspect is that the motion patterns vary with gaze location which in turn varies with terrain in a way that depends on behavioral goals. So while some aspects of the general patterns are not unexpected, the quantitative values depend on the statistics of the behavior. The actual statistics require these in situ measurements, and this has not previously been done, as stated in the abstract.

    The measured statistics provide a well-defined set of hypotheses about the pattern of direction and speed tuning across the visual field in humans. Points of comparison in the existing literature are hard to find because the stimuli have not been closely matched to actual retinal flow patterns, and the statistics will vary with the species in question. However, recent advances allow for neurophysiological measurements and eye tracking in experiments with head-fixed running, head-free, and freely moving animals. These emerging paradigms will allow the study of retinal optic flow processing in contexts that do not require simulated locomotion. While the exact relation between the retinal motion statistics we have measured and the response properties of motion-sensitive cells remains unresolved, the emerging tools in neurophysiology and computation make similar approaches with different species more feasible.

    A more detailed description of the methods, including the photogrammetry and the reference frames for the measurements, has been added, primarily to the Methods section.

    Reviewer #1 (Public Review):

    Much experimental work on understanding how the visual system processes optic flow during navigation has involved the use of artificial visual stimuli that do not recapitulate the complexity of optic flow patterns generated by actual walking through a natural environment. The paper by Muller and colleagues aims to carefully document "retinal" optic flow patterns generated by human participants walking a straight path in real terrains that differ in "smoothness". By doing so, they gain unique insights into an aspect of natural behavior that should move the field forward and allow for the development of new, more principled, computational models that may better explain the visual processing taking place during walking in humans.

    Strengths:

    Appropriate, state-of-the-art technology was used to obtain a simultaneous assessment of eye movements, head movements, and gait, together with an analysis of the scene, so as to estimate retinal motion maps across the central 90 deg of the visual field. This allowed the team to show that walkers stabilize gaze, causing low velocities to be concentrated around the fovea and faster velocities at the visual periphery (albeit more the periphery of the camera used than the actual visual field). The study concluded that the pattern of optic flow observed around the visual field was most likely related to the translation of the eye and body in space, and the rotations and counter-rotations this entailed to maintain stability. The authors were able to specify what aspects of the retinal motion flow pattern were impacted by terrain roughness, and why (concentration of gaze closer to the body, to control foot placement), and to differentiate this from the impact of lateral eye movements. They were also able to identify generalizable aspects of the pattern of retinal flow across terrains by subsampling identical behaviors in different conditions.

    Weaknesses:

    While the study has much to commend it, it could benefit from additional methodological information about the computations performed to generate the data shown. In addition, an estimation of inter-individual variability and of the roles of sex, age, and optical correction would increase our understanding of factors that could impact these results, providing a clearer estimate of how generalizable they are outside the confines of the present experiments.

    Properties of gait depend on the passive dynamics of the body and on factors such as leg length and subject-specific cost functions, and are influenced by image quality and therefore by optical correction. In this experiment all subjects had normal acuity or were corrected to normal (we have no information on their uncorrected vision); this is now noted in the Methods. The goal of the present work was to calculate average statistics over a range of observers and conditions in order to constrain the experience-dependent properties one might see in neurophysiology.

    We have added between-subjects error bars to Figure 2 and added gaze angle distributions as a function of terrain for individual observers in the Supplementary materials. Figure 4b and d now show standard errors across subjects, and individual subject plots are shown in the Supplementary materials. For Figure 2, most variability between subjects occurs in the Flat and Bark terrains, where individual trade-offs between energetic cost, speed, and stability might come into play. This is supported by our subsequent unpublished work on factors influencing foothold choice. We have also found that leg length determines path choices and will therefore influence the retinal motion. Differences between observers are now noted in the text; these individual differences should indicate the range of variability that might be expected in the underlying neural properties and perhaps in behavioral sensitivity. Because of the size of our dataset (n=11), it is not feasible to make comparisons by sex or age; there were equal numbers of males and females, and ages ranged from 24 to 54. This is now noted in the Methods section.

    Reviewer #2 (Public Review):

    The goal of this study was to provide in situ measurements of how combined eye and body movements interact with real 3D environments to shape the statistics of retinal motion signals. To achieve this, they had human walkers navigate different natural terrains while they measured information about eyes, body, and the 3D environment. They found average flow fields that resemble the Gibsonian view of optic flow, an asymmetry between upper and lower visual fields, low velocities at the fovea, a compression of directions near the horizontal meridian, and a preponderance of vertical directions modulated by lateral gaze positions.

    Strengths of the work include the methodological rigor with which the measurements were obtained. The 3D capture and motion capture systems, which have been tested and published before, are state-of-the-art. In addition, the authors used computer vision to reconstruct the 3D terrain structure from the recorded video.

    Together this setup makes for an exciting rig that should enable state-of-the-art measurements of eye and body movements during locomotion. The results are presented clearly and convincingly and reveal a number of interesting statistical properties (summarized above) that are a direct result of human walking behavior.

    A weakness of the article concerns tying the behavioral results and statistical descriptions to insights about neural organization. Although the authors relate their findings about the statistics of retinal motion to previous literature, the implications of their findings for neural organization remain somewhat speculative and inconclusive. An efficient coding theory of visual motion would indeed suggest that some of the statistics of retinal motion patterns should be reflected in the tuning of neural populations in the visual cortex, but as it stands, the present findings cannot be convincingly tied to known findings about the neural code of vision. Thus, the behavioral results remain strong, but the link to neural organization principles appears somewhat weak.

    We agree, but we think that strengthening the neural links requires future studies. As mentioned above, it is very difficult to relate the measured statistics to the existing neurophysiological literature, and we have tried to make this clearer in the Discussion (p14, 15, 16). This is because the stimuli chosen are typically arbitrary and not designed to be realistic examples of patterns consistent with natural motion across a ground plane. Other stimuli are simply inconsistent with self-motion together with gaze stabilization (e.g., nonzero velocity at the fovea). It has also been technically difficult to map cell properties across the visual field. We have made the comparisons we thought were useful. The point of the paper is to provide a hypothesis about the pattern of direction and speed tuning across the visual field, so the challenge for neurophysiology is to show how the observed cell properties vary across the visual field. Note also that the motion patterns will be influenced by the body motion of the animal in question; because of this, we are now collaborating with a group who are attempting to record from monkey MT/MST during locomotion while tracking eyes and body. Similarly, we are training neural networks on the patterns generated by human gait to develop more specific hypotheses about receptive field properties.

    Reviewer #3 (Public Review):

    Gaze-stabilizing motor coordination and the resulting patterns of retinal image flow are computed from empirically recorded eye movement and motion capture data. These patterns are assessed in terms of how much of the information potentially useful for guiding locomotion the retinal signals actually yield (as opposed to the "ecological" information in the optic array, which is defined independently of any particular sensor and sampling strategy).

    While the question posed is fundamental and the concept of the methodology shows promise, there are some methodological details to resolve. Also, some terminological ambiguities remain, a legacy of the field not yet having settled on standardized meanings for several technical terms that would be consistent across laboratory setups and field experiments.

    Technical limits and potential error sources should be discussed more. Additional ideas about how to extend or scale up the approach to tasks with more complex scenes, higher speeds, or other additional task demands, and what that might reveal beyond the present results, could also be discussed.

    This issue is addressed in more detail in the Discussion, in the second paragraph and the second-to-last paragraph.
