

KnowRob & Docker --- developer documentation

Building Docker images

The Dockerfiles for the KnowRob containers can be found in this repository: https://github.com/knowrob/docker. You can build Docker images from them using

  cd ~/docker/knowrob-daemon
  docker build -t knowrob/hydro-knowrob-daemon .
 
  cd ~/docker/knowrob-interactive
  docker build -t knowrob/hydro-knowrob-interactive .
 
  cd ~/docker/webapp
  docker build -t knowrob/webrob .

You can tag a version of an image using

  docker tag 18df3323a0d8 knowrob/webrob:0.1.0
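
If you do not know the image ID, 'docker images' lists all local images together with their IDs and tags. A specific tagged version can then be pushed to Docker Hub explicitly (the tag 0.1.0 is just the example from above):

  # list local images with their IDs and tags
  docker images
 
  # push a specific tagged version
  docker push knowrob/webrob:0.1.0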

Sharing images via Docker Hub

Instead of building all images from Dockerfiles, you can also upload them to and download them from Docker Hub. The push/pull interaction is very similar to git and uses the 'tags' associated with an image to describe which one to share:

  # upload image:
  docker push knowrob/hydro-knowrob-daemon
 
  # download images (this can also be done with the 'update-images' script)
  docker pull knowrob/hydro-knowrob-daemon
  docker pull knowrob/webrob
  docker pull knowrob/knowrob_data
  docker pull knowrob/user_data

Running Docker containers

KnowRob containers

There are two versions of the KnowRob container: the interactive one starts a Bash shell that can be used for console-based interaction with the system; the daemonized one starts KnowRob and json_prolog as a daemon without an interactive shell.

  # interactive:
  docker run -t -P -i knowrob/hydro-knowrob-interactive /bin/bash
 
  # daemon mode:
  docker run -t -P -d knowrob/hydro-knowrob-daemon
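
Since the containers are started with -P, the exposed container ports are published on host ports chosen by Docker. You can look up the actual mapping afterwards; the container name or ID is taken from the 'docker ps' output:

  # list running containers
  docker ps
 
  # show the host ports mapped for a given container
  docker port <container-id>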

Web application

The web app needs access to the Docker socket, so you have to start it like this:

  docker run --rm -i -p 5000:5000 -v /var/run/docker.sock:/var/run/docker.sock knowrob/webrob python webrob.py

You can then access the Web interface at http://localhost:5000
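
To check from the command line that the web app is up, you can request the start page with curl (the exact response depends on the application):

  curl -i http://localhost:5000/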

MongoDB

  docker run -d -v /data/db --name mongo_data busybox true
  docker run -d -p 27017:27017 --volumes-from mongo_data --name mongo_db mongo
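
To verify that the database started correctly, you can inspect the log output of the MongoDB container:

  docker logs mongo_db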

Importing data into MongoDB

If you want to import data into a dockerized MongoDB instance, first start an interactive container that mounts the host directory containing the exported JSON files:

  docker run -t -P --rm --link mongo_db:mongo \
         -v /home/tenorth/work/roslog:/var/roslog \
         -i knowrob/hydro-knowrob-interactive /bin/bash

Then you can run the 'mongoimport' program from within the container. Make sure to set the host correctly, i.e. to refer to the MongoDB instance in the other container:

  cd /var/roslog/<experiment-id>
  mongoimport --host "$MONGO_PORT_27017_TCP_ADDR:$MONGO_PORT_27017_TCP_PORT" --db roslog --collection tf tf.json
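
To check that the import succeeded, you can count the documents in the collection. This assumes that the 'mongo' shell client is available inside the container:

  # optional check: count the imported documents (assumes the mongo shell is installed)
  mongo "$MONGO_PORT_27017_TCP_ADDR:$MONGO_PORT_27017_TCP_PORT/roslog" --eval "db.tf.count()"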

Managing data using containers

Docker containers are cheap and usually short-lived, i.e. they can easily be started, stopped, removed and updated. The problem is how to store persistent data such as the content of a database or user-specific files. The approach we use is data-only containers – normal containers that contain nothing but a data volume that can be mounted by the other containers that provide the applications. These data-only containers are not meant to be removed; they persist over a longer period of time, even when the application containers get updated.
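
The general pattern looks as follows; the container name and the busybox image are only used for illustration:

  # data-only container holding a volume (names are illustrative)
  docker run --name some_data -v /data busybox true
 
  # application container mounting that volume
  docker run --rm --volumes-from some_data busybox ls /data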

In the KnowRob use case, there is one container per user (called <user>_data) and another one for common data (called knowrob_data). The user container serves as a sandbox in which users can store their files, while the knowrob_data container provides commonly used data sets. The common data container knowrob_data is auto-built on Docker Hub from the knowrob_data GitHub repository.

Creating the user data containers

The user containers are built from the user_data Dockerfile. It basically just creates a user 'ros' in order to set the permissions correctly and creates the directory '/home/ros/sandbox'. The following commands build a Docker image from this Dockerfile (this only needs to be done when the Dockerfile is updated) and start a container, in this case for the user 'demo'.

  cd docker/user-sandbox
  docker build -t knowrob/user_data .
  docker run --name demo_data knowrob/user_data true
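
The same command creates the data container for any other user, e.g. a hypothetical user 'alice':

  docker run --name alice_data knowrob/user_data true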

Updating the common data container

The knowrob_data image is automatically built whenever new data is pushed to the knowrob_data GitHub repository. This image is, however, not automatically updated in the server's registry. In order to pull the newest version and to replace the running container with a new one, use the following commands:

  docker pull knowrob/knowrob_data:latest 
  docker rm knowrob_data
  docker run --name knowrob_data knowrob/knowrob_data:latest true

Starting a user container that includes the data volumes

If the KnowRob image has been built as described earlier, it can be started with the following commands such that both the common and the user-specific data volumes are mounted:

  # interactive mode
  docker run -t -P -i --rm \
             --volumes-from knowrob_data \
             --volumes-from demo_data \
             --name demo \
             --link mongo_db:mongo \
             knowrob/hydro-knowrob-interactive \
             /bin/bash
 
  # daemon mode
  docker run -t -P -d --rm \
             --volumes-from knowrob_data \
             --volumes-from demo_data \
             --name demo \
             --link mongo_db:mongo \
             knowrob/hydro-knowrob-daemon
 

These examples are for the user 'demo'; replace 'demo' and 'demo_data' with the respective username.
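
When a daemonized user container is no longer needed, it can be stopped by name; if it was not started with --rm, it also has to be removed explicitly:

  docker stop demo
 
  # only needed if the container was not started with --rm
  docker rm demo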