Exchange information via RoboEarth
Note: The RoboEarth project has officially ended, and while the RoboEarth database will be kept running for a while, the code is currently unmaintained. Please use it at your own risk.
This tutorial lists some example queries for interacting with the RoboEarth web-based robot knowledge base. To learn more about RoboEarth, have a look at the project web site. You can also create an account at http://api.roboearth.org/ and explore the content of the database via the human-usable web interface.
In this tutorial, we will use the robot API for querying the database. The queries are sent from KnowRob, and after the download, the downloaded knowledge is available in the KnowRob system as if it had been loaded from a local file. You can therefore use the same kinds of queries for reasoning about objects and actions as in the previous tutorials.
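For example, once object models have been downloaded, generic KnowRob queries apply to them directly. The following sketch (the class Cabinet-PieceOfFurniture is just an illustrative choice) lists all cabinet instances currently known to the system:
owl_individual_of(Obj, knowrob:'Cabinet-PieceOfFurniture').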
During the download, you may see some 'service call failed' error messages that you can safely ignore. They occur because only part of the RoboEarth ecosystem is launched: the system tries to send object models to the vision system and maps to the respective components, and if those components are not running, the service calls fail.
Scenario
The demonstrated scenario is the following, also described in this paper: Robot 1, a PR2, performs a drink-serving task in one environment. As part of the task, it needs to open a cabinet to take the drink out. Since there is no information about the cabinet's articulation properties yet, the PR2 estimates the joint properties and creates the joint as part of the cabinet instance in the map. After having executed the task, it uploads the updated environment map to RoboEarth, and also extracts an updated model of the cabinet that now includes the joint information.
Robot 2, an Amigo robot, later performs the same task in a different environment. Since it has access to the updated object model, it can use the information about the articulation properties to open the cabinet door and take out the bottle.
During the download phase, the robots need to determine which information they actually need. This is done using the SRDL robot description language: actions declare dependencies on capabilities, and these dependencies are matched against the capabilities described in the robot's SRDL model, as sketched below.
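As a rough illustration of this matching, the following sketch assumes SRDL2-style predicates (missing_cap_for_action/3 and cap_available_on_robot/2); the exact predicate names may differ in the SRDL version shipped with the RoboEarth stack:
% Which capabilities does a downloaded recipe (bound to Recipe) require that the robot lacks?
missing_cap_for_action(Recipe, pr2:'PR2Robot1', MissingCap).
% Which capabilities does the robot provide at all?
cap_available_on_robot(Cap, pr2:'PR2Robot1').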
Installing and launching the system
In addition to KnowRob, you need the current RoboEarth stack, which you can check out and build using
svn co https://ipvs.informatik.uni-stuttgart.de/roboearth/repos/public/tags/latest roboearth
rosmake re_comm
You should then launch the interactive shell by typing
rosrun rosprolog rosprolog re_comm
There is also a launch file that starts KnowRob as a ROS node, offering a service that accepts queries:
roslaunch re_comm re_comm_knowrob.launch
Start the visualization windows:
visualisation_canvas(_), planvis_create(_).
Environment 1 -- PR2 robot
Download recipe and generate CPL plan (also downloads models for all objects the recipe refers to)
re_download_action_recipe('serve a drink', pr2:'PR2Robot1', Recipe), re_generate_cpl_plan(Recipe, CplPlan), planvis_load(Recipe,_).
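To inspect the downloaded recipe, you can, for example, list its sub-actions. This is a sketch assuming the knowrob_actions predicate plan_subevents/2 is available:
plan_subevents(Recipe, SubActions).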
Request environment maps (also downloads models for all objects in the map)
re_request_map_for([['kr:roomNumber', 3001], ['kr:floorNumber', '3'], ['kr:streetNumber', '45'], ['rdfs:label', 'Karlstrasse']], M), owl_individual_of(Map, knowrob:'SemanticEnvironmentMap'), add_object_with_children(Map, _).
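After the map has been downloaded, you can list the objects it describes. This sketch assumes the objects are linked to the map instance via the knowrob:describedInMap property of the IAS semantic map ontology:
owl_individual_of(Map, knowrob:'SemanticEnvironmentMap'), owl_has(Obj, knowrob:describedInMap, Map).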
Find out where the bottle is (the plan does not say that it is inside the cabinet). Here, computables are used to determine that it is inside the cabinet:
rdf_triple(knowrob:'in-ContGeneric', 'http://ias.cs.tum.edu/kb/ias_hospital_room.owl#bottle1', C), owl_individual_of(C, knowrob:'Container').
Perform the task…
Highlight plan steps during execution:
planvis_highlight('http://www.roboearth.org/kb/roboearth.owl#PickUpBottle',_).
planvis_highlight('http://www.roboearth.org/kb/roboearth.owl#MoveBaseToHandoverPose',_).
planvis_highlight('http://www.roboearth.org/kb/roboearth.owl#ReachToHandoverPose',_).
planvis_highlight('http://www.roboearth.org/kb/roboearth.owl#OpenGripperForHandover',_).
planvis_clear_highlight(_).
Create a joint between the cabinet and its door (the robot fills in appropriate values as determined by the articulation estimator):
create_joint_information('HingedJoint', 'http://ias.cs.tum.edu/kb/ias_hospital_room.owl#cabinet1', 'http://ias.cs.tum.edu/kb/ias_hospital_room.owl#door1', [1,0,0,1,0,1,0,1,0,0,1,1,0,0,0,1], [], '0.33', '0.1', '0.5', Joint).
Export and upload information. First, the robot creates a TBox (class-level) object model from the cabinet instance:
owl_export:tboxify_object_inst('http://ias.cs.tum.edu/kb/ias_hospital_room.owl#cabinet1', 'http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new', 'http://ias.cs.tum.edu/kb/ias_hospital_room.owl#cabinet1', 'http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new',tboxified).
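If you want to save the exported class model to a file, you can write out the corresponding RDF graph. This sketch assumes that the last argument, tboxified, names the RDF graph into which the exported triples were asserted; the file path is just an example:
rdf_save('/tmp/IkeaExpedit2x4-new.owl', [graph(tboxified)]).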
The following queries could then be used to update the information in the RoboEarth DB. They should not be called in this tutorial, since doing so would change the information in the DB and thereby render it unusable for others who do the tutorial later.
** DO NOT CALL! ** re_update_object('http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new', 'cabinet.ikeaexpedit2x4-new', 'updated by PR2').
** DO NOT CALL! ** re_update_map('http://ias.cs.tum.edu/kb/ias_hospital_room.owl#SemanticEnvironmentMap7635-new', 'semanticenvironmentmap.semanticenvironmentmap7635-new', 'updated by PR2').
Environment 2 -- Amigo robot
Please restart KnowRob to simulate being on a different robot that has not yet downloaded the information. Note that the information given in the following queries differs: the task is downloaded for 'AmigoRobot1', which has different components and capabilities than the PR2, and the address of the environment has changed.
Download recipe and generate CPL plan (also downloads models for all objects the recipe refers to)
re_download_action_recipe('serve a drink', amigo:'AmigoRobot1', Recipe), re_generate_cpl_plan(Recipe, CplPlan).
Request environment maps (also downloads models for all objects in the map)
re_request_map_for([['kr:roomNumber', '03.07.011'], ['kr:floorNumber', '3'], ['kr:streetNumber', '3'], ['rdfs:label', 'Boltzmannstrasse']], M).
knowrob_coordinates:update_instance_from_class_def('http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new', 'http://ias.cs.tum.edu/kb/fmi_hospital_room.owl#cabinet1').
Reason about objects in the map, e.g. determine where the bottle is:
rdf_triple(knowrob:'in-ContGeneric', 'http://ias.cs.tum.edu/kb/fmi_hospital_room.owl#bottle1', C), owl_individual_of(C, knowrob:'Container').
Perform the task…
Read the joint information (here you need to fill in the correct joint ID for Joint!):
read_joint_information(Joint, Type, Parent, Child, Pose, Direction, Radius, Qmin, Qmax).
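To find the joint ID, you can, for example, query for instances of the joint type created earlier; this sketch assumes the downloaded cabinet model contains a hinged joint:
owl_individual_of(Joint, knowrob:'HingedJoint').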
Export and upload information:
owl_export:tboxify_object_inst('http://ias.cs.tum.edu/kb/fmi_hospital_room.owl#cabinet1', 'http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new', 'http://ias.cs.tum.edu/kb/fmi_hospital_room.owl#cabinet1', 'http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new', tboxified).
** DO NOT CALL! ** re_update_object('http://www.roboearth.org/kb/roboearth.owl#IkeaExpedit2x4-new', 'cabinet.ikeaexpedit2x4', 'updated by PR2').
** DO NOT CALL! ** re_update_map('http://ias.cs.tum.edu/kb/fmi_hospital_room.owl#SemanticEnvironmentMap7635', 'semanticenvironmentmap.semanticenvironmentmap7398', 'updated by PR2').