Interacting and communicating with robots as naturally as we do with
people has long been a goal of AI and robotics
researchers. In this paper, we propose a novel approach to
communicating a navigation task to a robot, which allows the user to
sketch an approximate map on a PDA and then sketch the desired robot
trajectory relative to the map. State information is extracted from the
drawing in the form of relative, robot-centered spatial descriptions,
which are used for task representation and as a navigation language
between the human user and the robot. Examples of two hand-drawn maps
and the linguistic spatial descriptions generated from them are
included.
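To illustrate what a relative, robot-centered spatial description might look like, the following sketch converts an object's map position into the robot's frame and emits a qualitative phrase. This is a hypothetical toy, not the paper's actual extraction method; the function name, thresholds, and category labels are all assumptions for illustration.

```python
import math

def robot_centered_description(robot_x, robot_y, robot_heading,
                               obj_x, obj_y):
    """Toy illustration (not the paper's algorithm): describe an
    object's position relative to the robot's pose.

    robot_heading is in radians, 0 = facing the +x axis.
    """
    # Vector from robot to object, expressed as a relative bearing
    dx, dy = obj_x - robot_x, obj_y - robot_y
    bearing = math.degrees(math.atan2(dy, dx) - robot_heading)
    bearing = (bearing + 180) % 360 - 180  # normalize to (-180, 180]

    # Map the bearing to a qualitative direction (assumed 90-degree bins)
    if -45 <= bearing <= 45:
        side = "in front of"
    elif 45 < bearing <= 135:
        side = "to the left of"
    elif -135 <= bearing < -45:
        side = "to the right of"
    else:
        side = "behind"

    # Assumed distance threshold of 1.0 map unit for "near"
    dist = math.hypot(dx, dy)
    rng = "near" if dist < 1.0 else "far from"
    return f"The object is {side} and {rng} the robot."
```

For a robot at the origin facing along +x, an object at (2, 0) yields "in front of" and "far from", while one at (0, 0.5) yields "to the left of" and "near".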