doc:docker:add-dataset (current revision: 2014/12/07 08:30, admin)
  
The recommended procedure is to first test your data and to develop your queries locally, without using the docker virtualization environment. Once things work there, you can add your information to the right repositories and rebuild the docker containers so that your data is available in the system.
==== Develop and test locally (without docker) ====
  
      
This gives you the same Prolog shell as in the public Web interface, but you can additionally use tools such as the [[http://swi-prolog.org/pldoc/man?section=guitracer|guitracer]] for graphical debugging.
If you plan to use your own data in a MongoDB database, debugging becomes much easier if you first make sure the queries work from the Java-based MongoDBInterface (in knowrob_mongo), then test whether these methods can be called from Prolog using the interactive shell started by 'rosprolog', and only then test the system from within a docker container. You can create an Eclipse project from a rosjava package by calling //./gradlew eclipse// in the root of the catkin package.
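The debugging order described above can be sketched as a short command sequence. The package name and the exact rosprolog invocation below are assumptions for illustration, not commands taken from this page:

```shell
# 1. In the root of the catkin package (e.g. knowrob_mongo), generate an
#    Eclipse project for debugging the Java-based MongoDBInterface
./gradlew eclipse

# 2. Check that the Java methods are callable from Prolog in an interactive
#    shell (the rosrun invocation of the rosprolog wrapper is an assumption)
rosrun rosprolog rosprolog knowrob_mongo

# 3. Only after both steps work, test the system inside a docker container
```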
  
==== Add your own KnowRob package ====
== Required actions: ==
Add your KnowRob package to your fork of knowrob_addons and send a pull request.
==== Add OWL files and images ====
  
The OWL files with the logged actions and plan events are the central entry point for the reasoning procedures. They can automatically be generated using the logging infrastructure described [[http://www.cram-system.org/doc#logging_infrastructure|here]]. While you only have to add a KnowRob package when starting work on a completely new kind of data, OWL files are commonly added for each new experiment run.
  
These OWL files are stored in the [[https://github.com/knowrob/knowrob_data|knowrob_data]] GitHub repository. Whenever new data is pushed to this repository, a rebuild of the [[https://registry.hub.docker.com/u/knowrob/knowrob_data/|knowrob_data container]] is automatically triggered. In the same folder structure, we also store files such as images the robot has recorded. Please try to keep this repository at a reasonable size and, for instance, shrink images and store them as JPEG (which is smaller than PNG for photos).
== Required actions: ==
Add your OWL files and images to a new subfolder in your fork of knowrob_data and send a pull request.
==== Add MongoDB data ====
  
Currently, the MongoDB data is stored in a database that is shared among all experiments. This means we have to log into the server and import your data into this database. Have a look [[/doc/docker/dev|here]] for the commands to add data to a MongoDB instance that is running inside a container without externally exposed ports.
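For illustration, importing a dump into a MongoDB instance could look like the following sketch. The database name //roslog//, the collection name, and the file names are assumptions; the container-specific variant is described on the page linked above:

```shell
# hypothetical import of a JSON dump into a database (names are placeholders)
mongoimport --db roslog --collection tf --file tf_dump.json

# alternatively, restore a complete dump created with mongodump
mongorestore --db roslog dump/roslog
```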
  
== Required actions: ==
==== Adding a query library ====
  
The entries in the query library are generated from a JSON file of the following format. Each entry in the 'query' list corresponds to one row in the generated query library. JSON parsers do not like syntax errors, so it is recommended to use a validator such as [[http://jsonlint.com/|JSONLint]] to check the syntax before committing the file.
  
<code javascript>
// ...
</code>
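A minimal hypothetical example of such a file could look as follows; apart from the 'query' list named above, all field names and values are illustrative assumptions, not the documented schema:

```javascript
{
  "query": [
    { "q": "register_ros_package(knowrob_roslog_launch)",
      "text": "Initialize the system" },
    { "q": "owl_parse('package://knowrob_data/owl/your-data.owl')",
      "text": "Load the experiment description" }
  ]
}
```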
  
These query library files define the entry point into your dataset, including which files and which KnowRob packages are to be loaded. They should be executable starting from a freshly launched knowrob_roslog_launch launch file. This means that the first entries in this query list usually initialize the system in the way you need it, e.g. load packages, parse OWL files, etc. Currently, these query libraries are also used to select which experiment is to be loaded: the URL http://data.open-ease.org/exp/your-data will tell the system to load the query library from the file http://data.open-ease.org/static/queries-your-data.json.
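In addition to an online validator, the syntax can be checked locally. A small sketch using Python's built-in json.tool module; the file here is a stand-in created only to make the example self-contained:

```shell
# create a stand-in query library file; use your real queries-your-data.json
echo '{"query": []}' > queries-your-data.json

# json.tool parses the file and exits with a non-zero status on syntax errors
python3 -m json.tool queries-your-data.json > /dev/null && echo "JSON OK"
```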
  
== Required actions: ==
      
# if you have added your own query library:
cd ~/docker/webrob
docker build -t knowrob/webrob .
</code>
After rebuilding the containers, you can start the system locally and see if you can access your data. The //start-webrob// script contains the commands for starting all necessary containers:
  
  ~/docker/scripts/start-webrob
  
You should now be able to log in to the system at http://localhost and then open http://localhost/exp/your-data. This page should load the query library you have created, and you should be able to call the queries from there.
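A quick way to check reachability from the command line, assuming the web interface listens on the default HTTP port:

```shell
# prints the HTTP status code returned for the experiment page
curl -s -o /dev/null -w "%{http_code}\n" http://localhost/exp/your-data
```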
==== Deploying your code to the server ====
  
When everything works fine on your computer, you are ready to deploy the containers to the server at data.open-ease.org.
  
Please be careful with updates of the server -- only deploy well-tested code and refrain from deleting other people's containers! Especially when doing one of your first updates, try to do it when more experienced people are still around, and sufficiently before any important deadline so that there is still time to fix problems that may occur.
  
The following steps need to be performed after logging into the server: