
Figure: Schematic of our perception-action model for active vision. Adapted from Sharafeldin et al., 2024.
This project is in its early stages. In a recent publication (Sharafeldin et al. 2024, "Active sensing with predictive coding and uncertainty minimization", Cell Press), the Choi lab developed a perception-action model that uses a biologically motivated theory of predictive coding for perception and uncertainty minimization for action.
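To make the model's two components concrete, here is a minimal toy sketch of one perception-action loop (ours, not the published implementation): perception updates the belief by descending a prediction error, and each glimpse is directed where it most reduces posterior uncertainty. The 2-D scene, the glimpse-noise value, and the diagonal-Gaussian belief are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch of the perception-action loop: a 2-D latent scene, and each
# "glimpse" observes one coordinate with noise (all values hypothetical).
true_scene = np.array([1.5, -0.7])     # hidden state the agent infers
noise_var = 0.1                        # assumed glimpse noise

mu = np.zeros(2)                       # belief mean over the latent scene
var = np.full(2, 4.0)                  # belief variance (diagonal Gaussian)

for t in range(6):
    # --- Action: uncertainty minimization. Glimpse the coordinate with
    # the largest current variance, where a sample reduces uncertainty most.
    a = int(np.argmax(var))
    y = true_scene[a] + rng.normal(0.0, np.sqrt(noise_var))

    # --- Perception: predictive-coding-style update. Descend the summed
    # prediction errors instead of applying the closed-form Bayes rule.
    prior_mu, prior_var = mu[a], var[a]
    total_prec = 1.0 / noise_var + 1.0 / prior_var
    lr = 0.5 / total_prec              # step size scaled for stability
    for _ in range(30):
        err_obs = (y - mu[a]) / noise_var           # sensory prediction error
        err_prior = (prior_mu - mu[a]) / prior_var  # prior prediction error
        mu[a] += lr * (err_obs + err_prior)
    var[a] = 1.0 / total_prec          # posterior variance after the glimpse

    print(f"t={t} look@{a} mu={np.round(mu, 3)} var={np.round(var, 3)}")
```

Run as-is, the agent alternates its glimpses between the two coordinates, since whichever was just observed becomes the less uncertain one; in the full model the expected uncertainty depends on the inferred scene rather than on a fixed noise level.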
This model is applicable to any exploration setting in a task-independent manner, making it a plausible account of how biological systems infer the latent states of the world through active sensing. The goal of this project is to adapt the model to describe tracking behaviors of moths when the uncertainty and predictability of the visual scene are modulated by spatial unexpectedness, such as partial occlusion and cluttered scenes, and by temporal unexpectedness introduced by random motion of the object. We will first simulate tracking behaviors in the model, and then compare its latent variables to neural recordings from multiple visuomotor areas of moths navigating complex environments with varying degrees of uncertainty, in collaboration with the Theobald lab. A toy illustration of these two uncertainty manipulations follows below.
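The sketch below illustrates the two manipulations, not our actual stimuli or model: the target follows a random walk (temporal unexpectedness) and passes behind an occluder (spatial unexpectedness). A simple Kalman-style tracker stands in for the full predictive-coding model; the noise values and occluder interval are arbitrary assumptions, and the point is only that both manipulations inflate the tracker's uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stimulus sketch: random target motion plus an occluder.
drive_var, obs_var = 0.05, 0.02        # assumed motion and glimpse noise
occluder = (0.5, 1.5)                  # target is invisible in this interval

x = 0.0                                # true target position
mu, var = 0.0, 1.0                     # tracker's belief

for t in range(40):
    x += rng.normal(0.0, np.sqrt(drive_var))  # random walk: target motion
    var += drive_var                           # prediction step inflates variance
    visible = not (occluder[0] < x < occluder[1])
    if visible:                                # update only when unoccluded
        y = x + rng.normal(0.0, np.sqrt(obs_var))
        k = var / (var + obs_var)              # Kalman gain
        mu, var = mu + k * (y - mu), (1 - k) * var
    print(f"t={t:2d} visible={visible!s:5} x={x:+.2f} mu={mu:+.2f} var={var:.3f}")
```

During occluded stretches the belief variance grows step by step and the estimate drifts away from the target, which is the qualitative signature we will look for in both the simulated latent variables and the neural recordings.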