
Future car carrier blocksworld













Johannes Aldinger

Daniel Kuhner, Johannes Aldinger, Felix Burget, Moritz Göbelbecker, Wolfram Burgard and Bernhard Nebel. Closed-Loop Robot Task Planning Based on Referring Expressions. Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018), pp.

Increasing the accessibility of autonomous robots also for inexperienced users requires user-friendly and high-level control opportunities of robotic systems. While automated planning is able to decompose a complex task into a sequence of steps which reaches an intended goal, it is difficult to formulate such a goal without knowing the internals of the planning system and the exact capabilities of the robot. This becomes even more important in dynamic environments in which manipulable objects are subject to change. In this paper, we present an adaptive control interface which allows users to specify goals based on an internal world model by incrementally building referring expressions to the objects in the world.
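To make the idea of incremental goal formulation concrete, here is a minimal Python sketch, not taken from the paper's implementation: a referring expression is refined one attribute at a time until it picks out a unique object in a toy world model, and that object is then turned into a symbolic planner goal. All identifiers (WORLD, refine, the goal syntax) are invented for illustration.

    # Toy example only: narrowing down the referent of an expression by adding
    # one attribute at a time, then turning the unique referent into a goal.
    # All identifiers here are hypothetical, not taken from the paper's system.

    WORLD = [
        {"id": "cup1",  "type": "cup",  "color": "red",   "location": "kitchen"},
        {"id": "cup2",  "type": "cup",  "color": "green", "location": "kitchen"},
        {"id": "book1", "type": "book", "color": "red",   "location": "shelf"},
    ]

    def candidates(constraints):
        # All objects in the world model still consistent with the partial expression.
        return [o for o in WORLD
                if all(o.get(a) == v for a, v in constraints.items())]

    def refine(constraints, attribute, value):
        # Add one attribute to the referring expression and recompute the candidates.
        new_constraints = dict(constraints, **{attribute: value})
        return new_constraints, candidates(new_constraints)

    # Incremental goal formulation: "the cup" -> "the red cup".
    expr, objs = refine({}, "type", "cup")      # two candidates: cup1, cup2
    expr, objs = refine(expr, "color", "red")   # unique candidate: cup1
    assert len(objs) == 1

    # Once the expression denotes exactly one object, a symbolic goal can be
    # handed to the planner (the goal syntax below is made up for the example).
    goal = "(at {} living_room)".format(objs[0]["id"])
    print(goal)  # (at cup1 living_room)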


Daniel Kuhner, Lukas D.J. Fiederer, Johannes Aldinger, Felix Burget, Martin Völker, Robin T. Schirrmeister, Chau Do, Joschka Boedecker, Bernhard Nebel, Tonio Ball and Wolfram Burgard. A service assistant combining autonomous robotics, flexible goal formulation, and deep-learning-based brain–computer interfacing. 2019.

As autonomous service robots become more affordable and thus available for the general public, there is a growing need for user-friendly interfaces to control these systems. Control interfaces typically get more complicated with increasing complexity of robotic tasks and environments. Traditional control modalities such as touch, speech or gesture are not necessarily suited for all users. While some users can make the effort to familiarize themselves with a robotic system, users with motor disabilities may not be capable of controlling such systems even though they need robotic assistance most. In this paper, we present a novel framework that allows these users to interact with a robotic service assistant in a closed-loop fashion, using only thoughts. The system is composed of several interacting components: a brain–computer interface (BCI) that uses non-invasive neuronal signal recording and co-adaptive deep learning, high-level task planning based on referring expressions, navigation and manipulation planning as well as environmental perception. We extensively evaluate the BCI in various tasks, determine the performance of the goal formulation user interface and investigate its intuitiveness in a user study. Furthermore, we demonstrate the applicability and robustness of the system in real-world scenarios, considering fetch-and-carry tasks, close human–robot interactions and in presence of unexpected changes. As our results show, the system is capable of adapting to frequent changes in the environment and reliably accomplishes given tasks within a reasonable amount of time. Combined with high-level task planning based on referring expressions and an autonomous robotic system, interesting new perspectives open up for non-invasive BCI-based human–robot interactions.
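As a rough illustration of the closed-loop structure the abstract describes, the following Python sketch reduces each component (BCI decoding, goal formulation, task planning, execution with perception feedback) to a stub. None of the function names, messages or predicates come from the actual system, and the real components are of course far more involved.

    # Toy example only: the closed-loop cycle of decoding a user command,
    # formulating a goal, planning, and executing with perception feedback.
    # Every name and predicate below is invented for illustration.

    def decode_bci_command():
        # Stand-in for the non-invasive BCI decoder selecting a menu option.
        return {"action": "bring", "object": "cup1", "to": "living_room"}

    def plan(goal, facts):
        # Stand-in for the task planner decomposing the goal into robot actions.
        return ["navigate(kitchen)", "pick(cup1)",
                "navigate(living_room)", "place(cup1)"]

    def execute(step, facts):
        # Stand-in for navigation/manipulation plus perception updating the world model.
        if step == "place(cup1)":
            facts.discard("(at cup1 kitchen)")
            facts.add("(at cup1 living_room)")
        return True  # success is reported back to the user interface

    def closed_loop_episode():
        facts = {"(at cup1 kitchen)"}                    # world model from perception
        command = decode_bci_command()                   # 1. user intent via the BCI
        goal = "(at {} {})".format(command["object"], command["to"])  # 2. goal formulation
        for step in plan(goal, facts):                   # 3. task planning
            if not execute(step, facts):                 # 4. execution + feedback
                return False                             # a real system would replan here
        return goal in facts

    print(closed_loop_episode())  # True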













