Innovative Bimanual Robot with Human-Level Dexterity
Scientists at the University of Bristol have developed a bimanual robot, called Bi-Touch, that uses AI to inform its actions and displays tactile sensitivity close to human-level dexterity.
Revolutionizing Industries with Bi-Touch
The Bi-Touch system has the potential to revolutionize tasks such as fruit picking and domestic service, and could even be used to recreate the sense of touch in artificial limbs. The development is significant because it enables robots to carry out manual tasks by sensing what to do from a digital helper: an AI agent trained in simulation.
Tactile Feedback for Precise Sensing and Gentle Interaction
The researchers at the University of Bristol built a virtual-world simulation containing two robot arms equipped with tactile sensors, and trained AI agents within it to accomplish bimanual tasks built around touch. These agents can then be applied directly to the real world without further training.
The resulting tactile bimanual agent can solve tasks even under unexpected perturbations and can manipulate delicate objects gently. This level of tactile feedback is crucial for achieving human-level robot dexterity.
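The key to this sim-to-real transfer is that the agent only ever sees tactile and proprioceptive readings, so a policy trained against a simulated arm can be run unchanged on real hardware exposing the same observation interface. A minimal sketch of that idea (hypothetical class and function names, not the Bristol team's actual code):

```python
from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class Observation:
    tactile: List[float]       # per-sensor contact readings
    joint_angles: List[float]  # proprioception: arm joint positions


class Arm(Protocol):
    """Anything that yields observations and accepts joint commands."""
    def observe(self) -> Observation: ...
    def act(self, command: List[float]) -> None: ...


class SimulatedArm:
    """Stand-in for the virtual-world arm used during training."""
    def observe(self) -> Observation:
        return Observation(tactile=[0.0] * 4, joint_angles=[0.0] * 6)

    def act(self, command: List[float]) -> None:
        pass  # a real simulator would step its physics here


def policy(obs: Observation) -> List[float]:
    # Placeholder for the trained agent: maps observations to joint commands.
    return [0.1 * t for t in obs.tactile] + [0.0, 0.0]


def control_step(arm: Arm) -> List[float]:
    # The identical control loop runs whether `arm` is simulated or real,
    # because the policy never depends on anything simulation-specific.
    obs = arm.observe()
    cmd = policy(obs)
    arm.act(cmd)
    return cmd


control_step(SimulatedArm())
```

A real robot wrapped in a class with the same `observe`/`act` interface would drop in without retraining, which is what "directly applied to the real world" amounts to here.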
Deep Reinforcement Learning for Robot Learning
The Bi-Touch system uses Deep Reinforcement Learning (Deep-RL) to teach robots bimanual skills. Deep-RL is an advanced technique in the field of robot learning that allows robots to learn from trial and error. The agents make decisions by attempting various behaviors to achieve designated tasks, with rewards and punishments guiding their learning process.
Using this approach, the researchers enabled the dual-arm robot to safely lift fragile items, such as a single Pringle crisp. The AI agent relies solely on proprioceptive and tactile feedback, learning from rewards and punishments to refine its manipulation skills.
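The reward-and-punishment mechanism described above can be illustrated with a toy example (a hypothetical task, far simpler than the actual Bi-Touch training): an agent picks a discrete grip force, is punished for crushing or dropping the object, rewarded for a safe lift, and gradually settles on the gentle grip.

```python
import random

FORCES = [0, 1, 2, 3, 4]  # discrete grip-force levels (toy action space)
TARGET = 2                # the only force that lifts without crushing


def reward(force: int) -> float:
    # +1 for a safe lift, -1 for crushing (too hard) or dropping (too soft)
    return 1.0 if force == TARGET else -1.0


def train(episodes: int = 2000, epsilon: float = 0.1,
          alpha: float = 0.5, seed: int = 0) -> dict:
    rng = random.Random(seed)
    q = {f: 0.0 for f in FORCES}  # learned value estimate per action
    for _ in range(episodes):
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if rng.random() < epsilon:
            action = rng.choice(FORCES)
        else:
            action = max(q, key=q.get)
        # Nudge the estimate toward the observed reward (one-step update).
        q[action] += alpha * (reward(action) - q[action])
    return q


q = train()
best = max(q, key=q.get)
print(best)  # the learned "gentle grip" force
```

Deep-RL replaces the lookup table with a neural network and the single force value with high-dimensional tactile observations and continuous arm commands, but the trial-and-error principle is the same.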
Co-author Professor Nathan Lepora commented that the Bi-Touch system showcases an affordable approach using software and hardware for learning bimanual behaviors with touch. The code for the tactile dual-arm robot simulation will be open-source, allowing for further research and development in other tasks.
The lead author, Yijiong Lin, concluded that the Bi-Touch system allows a tactile dual-arm robot to learn solely from simulation and achieve various manipulation tasks in the real world. The ability to train AI agents in a virtual world within a short time frame opens up possibilities for tailored bimanual tasks.