ViSP - first tutorial

To start using ViSP for the very first time, I followed the tutorial on installing from source on Ubuntu Linux: https://visp-doc.inria.fr/doxygen/visp-3.4.0/tutorial-install-ubuntu.html

I settled for the Quick ViSP installation, since most of the third-party libraries that make up the Advanced ViSP installation probably won't be useful for my particular goal. Besides, if some of them do turn out to be useful, they can be installed afterward.

First, I created a ViSP workspace:

$ echo "export VISP_WS=$HOME/visp-ws" >> ~/.bashrc
$ source ~/.bashrc
$ mkdir -p $VISP_WS

Installed the standard (recommended) third-party libraries:

$ sudo apt-get install libopencv-dev libx11-dev liblapack-dev libeigen3-dev libv4l-dev libzbar-dev libpthread-stubs0-dev libjpeg-dev libpng-dev libdc1394-22-dev

Cloned the source code:

$ cd $VISP_WS
$ git clone https://github.com/lagadic/visp.git

Then I created the build folder and built ViSP:

$ mkdir -p $VISP_WS/visp-build
$ cd $VISP_WS/visp-build
$ cmake ../visp
$ make -j4

Finally, I set the ViSP environment variable:

$ echo "export VISP_DIR=$VISP_WS/visp-build" >> ~/.bashrc
$ source ~/.bashrc 
 

Now it's time to start the experiments!

I used a table board that I had, and modeled it to create the 3 files that you can see here: board model. The 3 required files are the ones with the .cao, .init and .xml extensions. I have already mentioned these 3 files in a previous post on this matter.
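For reference, the .cao model of a simple rectangular board can be sketched as a box with 8 points and 6 faces. The dimensions below (1.2 × 0.8 × 0.04 m) are only illustrative, not my board's actual measurements, and the point indices in each face must be ordered so that the face normal points outward:

```
V1
# 3D points
8
0   0   0
1.2 0   0
1.2 0.8 0
0   0.8 0
0   0   -0.04
1.2 0   -0.04
1.2 0.8 -0.04
0   0.8 -0.04
# 3D lines
0
# Faces from 3D lines
0
# Faces from 3D points
6
4 0 1 2 3
4 7 6 5 4
4 0 4 5 1
4 1 5 6 2
4 2 6 7 3
4 3 7 4 0
# 3D cylinders
0
# 3D circles
0
```

The .init file then holds (at least) 4 of these 3D points, which the tracker asks you to click in the first image to compute the initial pose, and the .xml file holds the camera and tracker settings.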

After creating this board model, I built the live tracker cpp file. I also have another post on how to build, compile and run C++ projects with .cpp and CMakeLists.txt files (here).
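Since the build relies on a CMakeLists.txt, a minimal one for building the tutorial against ViSP might look like this (the target and file names assume the tutorial source was copied into the project folder):

```cmake
cmake_minimum_required(VERSION 3.5)
project(tutorial-mb-generic-tracker-live)

# VISP_DIR (set in ~/.bashrc above) lets find_package locate the ViSP build tree
find_package(VISP REQUIRED)
include_directories(${VISP_INCLUDE_DIRS})

add_executable(tutorial-mb-generic-tracker-live tutorial-mb-generic-tracker-live.cpp)
target_link_libraries(tutorial-mb-generic-tracker-live ${VISP_LIBRARIES})
```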

After changing the /visp-ws/visp/tutorial/tracking/model-based/generic/tutorial-mb-generic-tracker-live.cpp file to include my 3 board model files and to open my camera, I just needed to go into the build folder and run the following command:

$ ./tutorial-mb-generic-tracker-live --model model/board/board.cao --tracker 0 
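For context, my edits to the tutorial source amounted to roughly the following sketch (assuming a V4L2 webcam on /dev/video0; the device and model paths are illustrative, and the real tutorial file also supports other grabbers such as OpenCV's cv::VideoCapture):

```cpp
#include <visp3/core/vpImage.h>
#include <visp3/sensor/vpV4l2Grabber.h>
#include <visp3/mbt/vpMbGenericTracker.h>

int main()
{
  vpImage<unsigned char> I;

  // Open the local camera (V4L2 build assumed)
  vpV4l2Grabber g;
  g.setDevice("/dev/video0");
  g.setScale(1);
  g.open(I);

  // Point the tracker at the three board model files
  vpMbGenericTracker tracker(1, vpMbGenericTracker::EDGE_TRACKER);
  tracker.loadConfigFile("model/board/board.xml");
  tracker.loadModel("model/board/board.cao");
  // board.init is used by tracker.initClick() to ask the user to
  // click the 4 reference points in the first image
  return 0;
}
```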

The option --tracker can be set to:

  • tracker 0 = moving edges only
  • tracker 1 = keypoints only (KLT)
  • tracker 2 = hybrid (both edges & keypoints)

Since the plain white board has no keypoints to detect, I chose tracking mode 0.

This was the result:

I found it very hard to continuously and correctly track the board pose. In particular, when the camera could only see the thin side of the board (its height), the detection was never right. These kinds of problems led me to conclude that this cannot be the way to use ViSP for my object tracking purpose.


 

The next step will be to try these steps again in a much more controlled environment and with an easily recognized object (with a distinct color or something like that).

All the source code used for this experiment can be found on my github page, here.



