A manuscript has been submitted and a pre-print now exists for this project!

Figure: (a) The two-part robotic flower. (b) Vision and mechanosensation sum linearly across luminance contexts. (c) Despite changes in the individual sensory modalities, tracking performance is maintained across contexts. (d) Mechanosensory gain decreases and visual gain increases with luminance. (e) The multiple-input-single-output control of hover-feeding using vision and mechanosensation.
The project investigates how visual and mechanosensory information are combined for flower tracking across changes in luminance. We used a two-part robotic flower that delivers separate visual and mechanosensory cues to hover-feeding hawkmoths: the façade provides the visual cue, while the nectary provides the mechanosensory cue. By presenting the cues coherently and in conflict, we constructed transfer functions for feedback control at both high and low luminance. The two modalities combine linearly across luminance levels, and frequency-dependent gains and phase lags in the visual response are compensated by complementary gain and phase changes in the mechanosensory response. Tracking performance is therefore robust to changes in luminance, through multisensory, frequency-dependent adaptations in sensing and control.
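The linear-summation model above can be sketched numerically. The snippet below is a minimal illustration, not the study's analysis: the gains and phase lags at each probe frequency are made-up placeholder values chosen only to mimic the qualitative pattern (vision weakens at low luminance, mechanosensation compensates), and the combined response is simply the complex sum of the two modalities.

```python
import numpy as np

# Probe frequencies (Hz); values are illustrative, not measured data.
freqs = np.array([0.5, 1.0, 2.0, 5.0])

def response(gain, phase_deg):
    """Build a complex frequency response from gain and phase (degrees)."""
    return gain * np.exp(1j * np.deg2rad(phase_deg))

# Hypothetical responses. High luminance: vision dominates.
vision_high = response(np.array([0.9, 0.8, 0.6, 0.3]),
                       np.array([-10, -20, -40, -80]))
mech_high   = response(np.array([0.1, 0.15, 0.2, 0.3]),
                       np.array([-5, -10, -20, -40]))

# Low luminance: visual gain drops, mechanosensory gain rises.
vision_low  = response(np.array([0.4, 0.35, 0.25, 0.1]),
                       np.array([-20, -35, -60, -100]))
mech_low    = response(np.array([0.6, 0.6, 0.55, 0.5]),
                       np.array([-5, -10, -20, -40]))

# Linear summation: the combined response is the complex sum of modalities.
combined_high = vision_high + mech_high
combined_low  = vision_low + mech_low

# The combined gain stays similar across luminance contexts even though
# each modality changes substantially.
print("combined |gain|, high luminance:", np.round(np.abs(combined_high), 3))
print("combined |gain|, low luminance: ", np.round(np.abs(combined_low), 3))
```

With these placeholder numbers, the per-frequency difference in combined gain between luminance contexts is far smaller than the difference in the visual gain alone, mirroring the compensation described in the text.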
This free-behavior project is now complete, and its results are being used in multistage neural recording projects.