Mirror Test
mirror_test.py implements the core of the multi-robot Mirror Self-Recognition Test from “Toward Robot Self-Consciousness (II): Brain-Inspired Robot Bodily Self Model for Self-Recognition”.
The experimental setup is as follows: three robots with identical appearances move their arms randomly in front of a mirror at the same time.
In the training stage, the robot learns the correlations between its self-generated actions and the visual feedback of the resulting motion: the spike-time differences between neurons in IPLM and IPLV drive learning through the spike-timing-dependent plasticity (STDP) mechanism.
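The STDP learning referred to above can be sketched with a generic pair-based update rule. This is a minimal illustration, not the paper's exact implementation; the parameter values (`a_plus`, `a_minus`, `tau`) are illustrative assumptions.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.1, a_minus=0.12, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Update one synaptic weight from the spike-time difference
    dt = t_post - t_pre (ms).  Pre-before-post (dt > 0) potentiates,
    post-before-pre (dt < 0) depresses; the magnitude decays
    exponentially with |dt|.  Illustrative parameters only."""
    if dt > 0:
        w = w + a_plus * np.exp(-dt / tau)
    elif dt < 0:
        w = w - a_minus * np.exp(dt / tau)
    # keep the weight in its allowed range
    return float(np.clip(w, w_min, w_max))
```

With this rule, synapses between motor (IPLM) and visual (IPLV) neurons that fire in a consistent action-then-feedback order are strengthened, which is the correlation the training stage exploits.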
In the test stage, the robot predicts the visual feedback generated by its own arm movement from the training results. With the InsulaNet, the robot can then identify which mirror image belongs to it.
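As a rough stand-in for the test-stage decision (the paper uses the InsulaNet, a spiking network; the function below is a non-neural illustration and its name is hypothetical): one can correlate the predicted motion trace with the detected trace of each mirror image and accept the best match as self.

```python
import numpy as np

def identify_self(predicted, detected_traces):
    """Return the index of the mirror image whose detected motion trace
    best matches the robot's predicted visual feedback, scored by
    Pearson correlation.  Illustrative stand-in for the InsulaNet."""
    scores = [np.corrcoef(predicted, trace)[0, 1] for trace in detected_traces]
    return int(np.argmax(scores))
```

For example, with three mirror images whose detected traces are compared against one robot's prediction, the index returned is the mirror image that robot should claim as its own.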
In the results, Motion Detection shows the output of visual detection, and Motion Prediction shows the visual feedback the robot predicts for its own movement. The red line in the figure indicates that the robot has determined that the corresponding mirror image belongs to itself.
Difference from the original article: since there is no motion error under the simulation conditions, theta_threshold is set to zero.
Citation
If you find this package helpful, please consider citing it:
@article{zeng2018toward,
title={Toward robot self-consciousness (ii): brain-inspired robot bodily self model for self-recognition},
author={Zeng, Yi and Zhao, Yuxuan and Bai, Jun and Xu, Bo},
journal={Cognitive Computation},
volume={10},
number={2},
pages={307--320},
year={2018},
publisher={Springer}
}