blog:ecai20-affordances — 2020/09/04, daniel86
====== A Formal Model of Affordances for Flexible Robotic Task Execution ======
  
In recent years, robotic agents have been shown to generate human-like behavior for everyday tasks such as preparing a meal or making a pancake. These demonstrations, while very impressive, also impose many hard constraints on the environment, so that deploying, for example, a robotic cook to an arbitrary household is still impossible. The step change is to make it possible to deploy robots to many different environments, and for many different tasks, without the need to re-program them. We believe that one of the key aspects of achieving this goal is an abstraction of the interaction between the robotic agent and its environment. Such an abstraction enables the robot to execute plans without hardcoded interaction patterns.