Robot System Assistant (RoSA): evaluation of touch and speech input modalities for on-site HRI and telerobotics
Future work scenarios envision increased collaboration between humans and robots, emphasizing the need for versatile interaction modalities. Robotic systems can support various use cases, including on-site operations and telerobotics. This study investigates a hybrid interaction model in which a single user engages with the same robot both on-site and remotely. Specifically, the Robot System Assistant (RoSA) framework is evaluated to assess the effectiveness of touch and speech input modalities in these contexts. Participants interact with two robots, Rosa and Ari, using both input modalities. The results reveal that touch input excels in precision and task efficiency, while speech input is preferred for its intuitive and natural interaction flow. These findings contribute to understanding the complementary roles of touch and speech in hybrid systems and their potential for future telerobotic applications.
Citing
@article{strazdas2025robot,
  title={Robot System Assistant (RoSA): evaluation of touch and speech input modalities for on-site HRI and telerobotics},
  author={Strazdas, Dominykas and Busch, Matthias and Shaji, Rijin and Siegert, Ingo and Al-Hamadi, Ayoub},
  journal={Frontiers in Robotics and AI},
  volume={12},
  pages={1561188},
  year={2025},
  publisher={Frontiers}
}