Adding a dataset to openEASE

This document is meant for internal use. If you are not part of the developer team, these instructions may not work for you.

Components of a data set

A data set may consist of several different kinds of information and/or inference methods that all have to be added to the system. The most common components are listed below; the following sections describe how to include each of them:

  • KnowRob package with Prolog predicates for reasoning
  • OWL file with instances of actions and events (and possibly other information)
  • Images and other files
  • MongoDB with recorded tf data
  • Query library with prepared demo queries

Required steps for adding your own data to the system

The recommended procedure is to first test your data and to develop your queries locally, without using the docker virtualization environment. Once things work there, you can add your information to the right repositories and rebuild the docker containers so that your data is available in the system.

Develop and test locally (without docker)

It is recommended to first develop and test your data on your local system using a normal KnowRob installation. In this setup, it is much easier to debug problems, and once things work there, you can easily transfer them to the 'containerized' version. You can start a local KnowRob system with the same setup as the public web interface by running

roslaunch knowrob_roslog_launch knowrob.launch

This will start a webserver at http://localhost:1111 through which you can send queries. Note that most of the demo queries will not work unless you have the required data in your local MongoDB database. You can also start an interactive Prolog shell using the same package setup, but in order to see the visualizations in the browser, you also have to start the rosbridge server that bridges between the ROS world and the Web world:

roslaunch rosbridge_server rosbridge_websocket.launch
rosrun rosprolog rosprolog knowrob_roslog_launch

This will give you the same terminal as the Prolog shell in the public Web interface, but you can additionally use tools such as the guitracer for graphical debugging.

If you plan to use your own data in a MongoDB database, debugging becomes much easier if you first make sure the queries work from the Java-based MongoDBInterface (in knowrob_mongo), then test whether these methods can be called from Prolog using the interactive shell started by 'rosprolog', and only then test the system from within a docker container. You can create an Eclipse project from a rosjava package by calling ./gradlew eclipse in the root of the catkin package.
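
When narrowing down where a Mongo query fails, it can also help to probe the database directly from a small script before involving the Java and Prolog layers. The sketch below targets the 'roslog' database and 'tf' collection used elsewhere in this document; the field names ('transforms.child_frame_id', '__recorded') follow common ROS/KnowRob logging conventions but are assumptions that may differ in your dump.

```python
def tf_frame_query(child_frame, before_stamp=None):
    """Build a MongoDB query document for tf records that contain the given
    child frame, optionally restricted to records logged before a timestamp."""
    # 'transforms.child_frame_id' follows the standard ROS tf message layout;
    # '__recorded' is an assumed name for the logging timestamp field.
    query = {"transforms.child_frame_id": child_frame}
    if before_stamp is not None:
        query["__recorded"] = {"$lte": before_stamp}
    return query

# Usage against a live database (pip install pymongo):
#   from pymongo import MongoClient
#   db = MongoClient("localhost", 27017)["roslog"]
#   print(db["tf"].find_one(tf_frame_query("/base_link")))
```

If this direct probe already returns nothing, the problem is in the data import rather than in the Java or Prolog code.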

Add your own KnowRob package

If you are working on a totally new kind of data set, chances are that you would like to add your own KnowRob package with reasoning predicates specific to your data. Again, this is best developed locally first. This page describes the necessary steps for creating a KnowRob package. Once you have finished it, please add it to the knowrob_addons repository at GitHub – either directly, if you have commit rights, or by sending a pull request. Every time a ROS package has been added, the KnowRob container (knowrob/hydro-knowrob-daemon) has to be rebuilt.

Required actions:

Add your KnowRob package to your fork of knowrob_addons and send a pull request.

Add OWL files and images

The OWL files with the logged actions and plan events are the central entry point for the reasoning procedures. They can automatically be generated using the logging infrastructure described here. While you only have to add a KnowRob package when starting work on a completely new kind of data, OWL files are commonly added for each new experiment run.

These OWL files are stored in the knowrob_data GitHub repository. Whenever new data is pushed to this repository, a rebuild of the knowrob_data container is triggered automatically. In the same folder structure, we also store files such as images the robot has recorded. Please try to keep this repository at a reasonable size: for instance, shrink images and store them as JPEG (which is smaller than PNG for photos).
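
A small batch-conversion script can help with the shrinking step. This is only a sketch: the use of Pillow for the actual conversion is an assumption, and any image tool works just as well.

```python
import os

def jpeg_targets(folder):
    """Map each .png file in 'folder' to the .jpg path it should become."""
    return {
        os.path.join(folder, name): os.path.join(folder, os.path.splitext(name)[0] + ".jpg")
        for name in sorted(os.listdir(folder))
        if name.lower().endswith(".png")
    }

def convert_all(folder, quality=85):
    """Convert the photos in place using Pillow (pip install Pillow)."""
    from PIL import Image  # imported lazily so jpeg_targets() stays standalone
    for src, dst in jpeg_targets(folder).items():
        Image.open(src).convert("RGB").save(dst, "JPEG", quality=quality)
        os.remove(src)
```

Run the conversion locally before committing, so the large PNGs never enter the repository history.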

Required actions:

Add your OWL files and images to a new subfolder in your fork of knowrob_data and send a pull request.

Add MongoDB data

Currently, the MongoDB data is stored in a database that is shared among all experiments. This means we have to log into the server and import your data into this database. Have a look here for commands for adding data to a MongoDB instance that is running inside a container without externally exposed ports.

Required actions:

Call the mongoimport program on the computer where the database is running (e.g. your laptop or the server). The example below is for tf data; please adapt it for any other collections you may want to import:

mongoimport --db roslog --collection tf your-tf.json
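
In its default mode, mongoimport reads one JSON document per line, so a quick pre-check of the dump file can save a round trip to the server. A minimal sketch:

```python
import json

def check_mongoimport_file(path):
    """Count the documents in a mongoimport-style file (one JSON document per
    line); raise ValueError naming the first line that does not parse."""
    count = 0
    with open(path) as f:
        for lineno, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                json.loads(line)
            except ValueError as err:
                raise ValueError("line %d: %s" % (lineno, err))
            count += 1
    return count
```

The returned document count should match what mongoimport reports after the import.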

Add a query library

The entries in the query library are generated from a JSON file of the following format. Each entry in the 'query' list corresponds to one row in the generated query library. JSON parsers do not like syntax errors, so it is recommended to use a validator such as JSONLint to check the syntax before committing the file.

    {
        "query": [
            {
                "q": "",
                "text": "----- This is a title -----"
            },
            {
                "q": "member(A, [a,b,c]).",
                "text": "This is a demo entry"
            }
        ]
    }
These query library files define the entry point into your dataset, including which files and which KnowRob packages are to be loaded. They should be executable starting from a freshly launched knowrob_roslog_launch launch file, which means that the first entries in the query list usually initialize the system in the way you need it, e.g. load packages, parse OWL files etc. Currently, these query libraries are also used to select which experiment is to be loaded: the experiment name in the URL tells the system which query library file to load.
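
Beyond running the file through a validator such as JSONLint, a short script can also check the structural conventions described above. This is a sketch; it assumes every entry needs both a 'q' and a 'text' key, as in the example.

```python
import json

def check_query_library(text):
    """Validate query-library JSON: it must parse, contain a top-level 'query'
    list, and every entry must provide both a 'q' and a 'text' key."""
    data = json.loads(text)  # raises ValueError on syntax errors
    entries = data.get("query")
    if not isinstance(entries, list):
        raise ValueError("top-level 'query' list is missing")
    for i, entry in enumerate(entries):
        missing = {"q", "text"} - set(entry)
        if missing:
            raise ValueError("entry %d lacks: %s" % (i, ", ".join(sorted(missing))))
    return len(entries)
```

Running this before committing catches both syntax errors and entries that would render as empty rows in the query library.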

Required actions:

Add your query library to the 'static' folder in the web-app container in the knowrob/docker repository.

Rebuilding the containers

The previous steps have added the different components of your data set (OWL log files, Prolog code, MongoDB dumps) to the right places in the system. To include this information into the docker containers, you will need to re-create some of the containers the openEASE system is composed of. Which containers have to be re-built depends on which components you have added to your data set.

For re-building them, you will need a checkout of the knowrob/docker repository that contains a set of Dockerfiles. Similar to Makefiles, they describe the steps that need to be performed for creating the respective containers. All paths in this section assume that this repository has been checked out to ~/docker. These commands will retrieve code from the GitHub repositories, i.e. before building the containers, all changes have to be pushed and all pull requests have to be merged into the master branch.

# if you have added your own KnowRob package in knowrob_addons:
cd ~/docker/knowrob-daemon
docker build -t knowrob/hydro-knowrob-daemon .
# If you have added an OWL file in knowrob_data, the container 
# will automatically be built, so you can just pull it once the
# build is finished. Check the build details tab here:
docker pull knowrob/knowrob_data:latest
docker rm knowrob_data
docker run --name knowrob_data knowrob/knowrob_data:latest true
# if you have added your own query library:
cd ~/docker/webrob
docker build -t knowrob/webrob .

Accessing your new experiment

After rebuilding the containers, you can start the system locally and see if you can access your data. The start-webrob script contains the commands for starting all necessary containers.

You should now be able to login to the system at http://localhost and then open http://localhost/exp/your-data. This page should load the query library you have created, and you should be able to call the queries from there.

Debugging the containerized setup

If you have tested your code with a 'normal' KnowRob without docker first, chances are good that things also work in the 'containerized' version. However, there may still be issues, and the following points may help to get started with debugging:

  • The terminal in which you executed the 'start-webrob' script will contain the output of the 'webrob' container that serves the HTML/JavaScript front-end. Watch out for Python exceptions and for HTTP response codes 404 (file not found) and 500 (internal server error, often an exception in the program generating a page).
  • To inspect the output of the KnowRob container, you can use the docker logs and docker attach commands on your user container. This container has the same name as your username, so the command would be docker logs demo for user 'demo'.
  • If you make changes to the query library and it is not reloaded, this may be related to the browser cache. Try opening that file directly (e.g. http://localhost/static/queries-your-data.json) and press 'reload' in the browser.
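
The two URLs mentioned above follow a simple naming scheme; here is a small helper to derive them from an experiment name. The pattern is inferred from the examples in this document, so treat it as an assumption.

```python
def experiment_urls(name, host="localhost"):
    """Derive the experiment page and its query library file from the
    experiment name, following the /exp/<name> and
    /static/queries-<name>.json pattern used in this document."""
    return {
        "page": "http://%s/exp/%s" % (host, name),
        "queries": "http://%s/static/queries-%s.json" % (host, name),
    }
```

Keeping the experiment name and the query library file name in sync is what makes the /exp/ page find your queries.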

Deploying your code to the server

When everything works fine on your computer, you are ready to deploy the containers to the server at

Please be careful with updates of the server – only deploy well-tested code and refrain from deleting other people's containers! Especially when doing one of your first updates, try to do it when more experienced people are still around and sufficiently before any important deadline so that there is still time to fix problems that may occur.

The following steps need to be performed after logging into the server:

  1. Import your data into the MongoDB on the server as described above. You will need to copy your-tf.json to the server and run the mongoimport program from there.
  2. Rebuild the containers you have changed, as you did on your local computer (see the description above).
  3. The startup script is running in a screen session. You will need to resume that session, stop the currently running script, and start it again.
# connect to the server
# resume the screen session
screen -r
# stop the script with CTRL+C
# start the script again
# detach the screen session using CTRL+A+D