In this paper we introduce Haptic-Informed ACT, a robotic system for pseudo oocyte manipulation that integrates multimodal information with Action Chunking with Transformers (ACT). Traditional automation methods for oocyte transfer rely heavily on visual perception and often require human supervision because of biological variability and environmental disturbances. Haptic-Informed ACT extends ACT with haptic feedback, enabling real-time grasp-failure detection and adaptive correction. Additionally, we present a 3D-printed TPU soft gripper to facilitate delicate manipulation. Experimental results demonstrate that Haptic-Informed ACT improves task success rate, robustness, and adaptability over conventional ACT, particularly in dynamic environments. These findings highlight the potential of multimodal learning in robotics for biomedical automation.
@misc{uriguen2025hapticact,
  author        = {Uriguen Eljuri, Pedro Miguel and Shibata, Hironobu and Maeyama, Katsuyoshi and Jia, Yuanyuan and Taniguchi, Tadahiro},
  title         = {Haptic-Informed {ACT} with a Soft Gripper and Recovery-Informed Training for Pseudo Oocyte Manipulation},
  year          = {2025},
  eprint        = {2506.18212},
  archivePrefix = {arXiv},
  primaryClass  = {cs.RO},
  url           = {https://arxiv.org/abs/2506.18212},
}
This work was supported by the Japan Science and Technology Agency (JST) Moonshot Research and Development Program, Grant Number JPMJMS2033.