Abstract: To derive meaningful navigation strategies, animals have to estimate their directional headings in the environment. This function is achieved by head direction cells found in mammalian brains, whose neural activity encodes the animal's heading direction. It is believed that such head direction information is generated by integrating self-motion cues, which also introduces cumulative errors in the long term. To eliminate such errors, this paper presents an efficient calibration model that mimics animals' behavior by exploiting visual cues in a biologically plausible way, and then implements it in robotic navigation tasks. The proposed calibration model allows the agent to associate its head direction and the perceived egocentric direction of a visual cue with its position and orientation, and therefore to calibrate the head direction when the same cue is viewed again. We examine the proposed head direction calibration model in extensive simulations and real-world experiments and demonstrate its excellent performance in terms of rapid association of information with proximal and distal cues, as well as accurate calibration of the integration errors of the head direction. Videos can be viewed at https://videoviewsite.wixsite.com/hdc-calibration.
Funding: This work received funding from the European Union's Horizon 2020 Framework Programme for Research and Innovation under Specific Grant Agreement No. 945539 (Human Brain Project SGA3), and was also funded by Pazhou Lab (PZL2021KF0020).
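The abstract describes the calibration principle only at a high level: the agent associates its head direction and the perceived egocentric direction of a visual cue with its position and orientation, and later uses the re-viewed cue to correct the drifted heading. The sketch below illustrates the underlying geometry in plain Python; it is not the paper's biologically plausible neural model, and the class and method names (HeadingCalibrator, associate, calibrate) as well as the cue-distance parameter are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def wrap_angle(theta):
    """Wrap an angle to the interval [-pi, pi)."""
    return (theta + np.pi) % (2.0 * np.pi) - np.pi


class HeadingCalibrator:
    """Toy geometric sketch of landmark-based heading calibration.

    On first sight of a cue, its allocentric position is inferred from the
    agent's position, heading, and the perceived egocentric bearing (plus an
    assumed distance estimate). When the same cue is viewed again, the stored
    cue position yields its current allocentric bearing, and subtracting the
    currently perceived egocentric bearing recovers a drift-free heading.
    """

    def __init__(self):
        self.cues = {}  # cue_id -> inferred cue position (x, y)

    def associate(self, cue_id, position, heading, egocentric_bearing, cue_distance):
        """Store the cue's inferred allocentric position at first encounter."""
        allocentric_bearing = wrap_angle(heading + egocentric_bearing)
        cue_xy = np.asarray(position, dtype=float) + cue_distance * np.array(
            [np.cos(allocentric_bearing), np.sin(allocentric_bearing)]
        )
        self.cues[cue_id] = cue_xy

    def calibrate(self, cue_id, position, egocentric_bearing):
        """Return a corrected heading when an associated cue is re-viewed."""
        if cue_id not in self.cues:
            return None
        dx, dy = self.cues[cue_id] - np.asarray(position, dtype=float)
        allocentric_bearing = np.arctan2(dy, dx)  # cue bearing from current position
        return wrap_angle(allocentric_bearing - egocentric_bearing)


# Example usage: associate a cue while the heading is still reliable, then
# correct the heading after path-integration drift when the cue is seen again.
cal = HeadingCalibrator()
cal.associate("cue_1", position=(0.0, 0.0), heading=0.0,
              egocentric_bearing=np.pi / 4, cue_distance=5.0)
corrected_heading = cal.calibrate("cue_1", position=(1.0, 0.5),
                                  egocentric_bearing=0.6)
```

Because the cue's position (not just its bearing) is stored, the sketch handles both proximal cues, whose bearing changes with the agent's position, and distal cues, whose bearing is nearly position-independent; the paper's model achieves this association with neural mechanisms rather than explicit coordinates.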