PiVR has been developed by David Tadres and Matthieu Louis (Louis Lab).

8. Advanced topics

8.1. Simulate Real-Time Tracking

Imagine setting up your experiment: preparing the animals, booking the setup/room for a whole afternoon…and then the tracker does not track the animal half of the time!

It is quite frustrating sitting in a dark room trying to troubleshoot these kinds of problems.

To alleviate situations like these, there is the option to simulate real-time tracking after installing the PiVR software on a PC (=not on the Raspberry Pi).

  1. At the PiVR setup, double check that:

    1. you have set the resolution to the resolution you want to use,

    2. that the pixel/mm is set correctly,

    3. that the framerate is identical to the framerate you are trying to achieve with real-time tracking.

    4. that you have selected the correct animal.

  2. Then, record a video with these settings. Then record some more.

  3. Transfer the video data to your PC where you have installed the PiVR software and select Debug->Simulate Online Tracking.

    11_simulate_online.png
  4. Make sure the Animal Detection Method is the same as the one you want to use during Real-Time tracking.

    Note

    This has not been tested with Mode 2

  5. Select a single folder. You will now see the metadata created while the video was taken. Carefully inspect it to see if the settings are as you expect them.

    12_read_metadata.png
  6. Press the ‘Track Single Animal’ button - you will get a popup as soon as the animal detection algorithm detects a moving object.

    13_simulate_tracking_detection.png
  7. After pressing Ok, you will see what the animal detection algorithm has defined as the animal.

    It is obvious that something has gone wrong here, as the binary image on the right contains many white spots (areas which are considered to be the animal).

    14_simulate_tracking_detection_broken.png
  8. After pressing Ok, the tracking algorithm starts - as the animal has not been properly identified in the first frame, the tracking algorithm is unable to identify the animal in the subsequent frames either:

    15_simulated_tracking_broken.png
  9. After going through the simulated tracking, the potential source of the problem has been identified: the animal cannot be detected correctly. There are many possible reasons for this:

    1. The edge of the dish seems to have moved a bit during the first couple of frames (red rectangle). If you are able to stabilize the setup to ensure no movement while doing experiments, this problem should be solved.

      16_detection_problems1.png
    2. The fact that several spots in the middle of the dish are wrongly binarized as the potential animal indicates that the detection algorithm has trouble setting the threshold correctly. This problem arises because the threshold is calculated as 2 standard deviations from the mean of the pixel intensities in the subtracted image (pre_experiment.FindAnimal.define_animal_mode_one()). While the animal is the darkest spot in the image, the whole petri dish is darker than the background, which can lead to this problem.

      17_detection_problem2.png

      There are two general ways to solve this problem:

      1. Optimize the imaging conditions so that the animal has a higher contrast to the background, which should be as homogeneous as possible. See here for an example.

      2. Optimize the animal parameters. You can follow this guide to set stringent animal parameters for tracking.

  10. There are many ways in which tracking can fail; only a single example is described above. Hopefully this walkthrough gives you a general idea of where in the tracking process the algorithm fails.
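The thresholding described above (two standard deviations above the mean of the background-subtracted image) can be sketched in Python. The function and variable names are illustrative, not PiVR's actual code, and the sketch assumes the animal is darker than the background:

```python
import numpy as np

def detection_threshold(background, frame):
    """Illustrative sketch of the animal detection threshold:
    everything brighter than mean + 2*std in the background-subtracted
    image is considered a candidate animal pixel."""
    # Subtract the current frame from the background so that the
    # (dark) animal appears as a bright blob in the difference image.
    subtracted = background.astype(np.int16) - frame.astype(np.int16)
    subtracted = np.clip(subtracted, 0, 255).astype(np.uint8)
    # Threshold at 2 standard deviations above the mean intensity.
    threshold = subtracted.mean() + 2 * subtracted.std()
    return subtracted > threshold  # binary image: True = "animal"
```

A petri dish that is darker than the surrounding background inflates the mean and standard deviation of the subtracted image, which is exactly how the spurious white spots above can arise.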

8.2. Tracking of a new animal

PiVR has been used to track a variety of animals: walking adult fruit flies, fruit fly larvae, spiders, fireflies, pillbugs and zebrafish.

To function optimally, the tracking algorithm takes several “animal parameters” into account:

  1. The number of pixels the animal will fill in the image.

  2. The “roundness” of the animal.

  3. The proportions of the animal.

  4. The length of the animal.

  5. The speed of the animal.

These parameters have been defined for each animal you can choose in Options->Select Organism. You can find them in the file “list_of_available_organisms.json” in the source code.

If you want to track an animal that is not on the list you can always try to use the “Not in list” option. However, the tracking algorithm might not work optimally.

There is a straightforward pipeline to collect the necessary animal parameters to optimize real-time tracking:

  1. Place your (single!!) animal in the arena you want to use for your experiment.

  2. As always, do not forget to define the pixel/mm.

  3. Select “Not in List” under Options->Select Organism.

  4. Record a video. If you use a fast animal, make sure to select a sufficiently high framerate. As always, it is imperative that the camera and the arena are stable during recording, i.e. nothing in the image should move except the animal!

  5. Record for a couple of minutes, e.g. 5 minutes.

  6. Make sure you have videos with animals moving as fast as they might in your actual experiment.

  7. It is also necessary that the animals move for a large fraction of the video!

  8. Take the videos to your PC on which you have installed the PiVR software.

  9. To observe what the algorithm is doing, turn the Debug mode on. This is recommended, as you will see immediately if and where something goes wrong. This can help you to solve tracking problems.

  10. Analyze each video using the Tools->Analysis: Single Animal Tracking option.

  11. If using the Debug mode, you will be informed as soon as the algorithm detects an object that is moving. It will also inform you how much space (in pixels) the detected animal occupies, its eccentricity (‘roundness’) and a parameter for proportions (major axis over minor axis). If the identified object is clearly not the animal, answer the question with “No” and the algorithm will look for the largest moving object in the next frame.

    1_debug_detection1.png
  12. Next, you will see a side-by-side comparison of the original picture (with a box drawn around the detected animal) and the binary image you have seen in the previous popup.

    2_debug_detection2.png
  13. The algorithm will then start tracking. You will see an overview of how the algorithm detects the animal: On the left you can see the original image. In the center you can see the binary image: The grey area indicates the search box (which depends on the defined maximum speed of the animal, the pixel/mm and the framerate) and the white pixels are those below threshold. The black area is not considered, as it is too far away from the position of the animal in the previous frame. On the right, you can see the result of the tracking: a box drawn around the identified animal. In addition, you can see the animal parameters you are looking for. These are just for your information; read below to see how to comfortably get the list of these parameters.

    3_debug_tracking.png
  14. After running the Single Animal Tracking algorithm, you will find a number of new files in each experimental folder. To get to the animal parameters, open the file “DATE_TIME_heuristics.csv”, for example with Excel.

    4_heuristics.png
  15. Each row in the table stands for one frame. The title of the column describes the value.

    5_heuristics_columns.png
  16. You need to get the following values to get all animal parameters:

    1. A minimum value for filled area (in mm)

    2. A maximum value for filled area (in mm)

    3. A minimum value for eccentricity

    4. A maximum value for eccentricity

    5. A minimum value for major over minor axis

    6. A maximum value for major over minor axis

    7. Maximum skeleton length

    8. Maximum speed of the animal (mm/s)

  17. As the tracking algorithm needs the extreme values to function properly, I have found it easiest to plot a Line Plot for each experiment for each of the relevant parameters. For example for the filled area:

    6_heuristic_plot_example.png
  18. Write down the maximum and minimum value for each of relevant parameters. In this example, the minimum value for filled area in mm would be ~25 and the maximum would be ~90.

  19. Do the same for eccentricity, major over minor axis, skeleton length and maximum speed (mm/s)

  20. Then do the same for a few other videos. The goal is to get extreme values without having to put 0 as minimum and infinity as maximum.

  21. In this example I have found the following parameters:

    1. Minimum value for filled area (in mm): 20

    2. Maximum value for filled area (in mm): 90

    3. Minimum value for eccentricity: 0.4

    4. Maximum value for eccentricity: 1

    5. Minimum value for major over minor axis: 1

    6. Maximum value for major over minor axis: 3.5

    7. Maximum skeleton length: 14

    8. Maximum speed of the animal (mm/s): 350

  22. Now go to the PiVR software folder on your PC and find the file named: “list_of_available_organisms.json”:

    7_open_organism_json.png
  23. Open it with a text editor. I often use “Code Writer”, which ships with Windows. You will see that there are repeating structures: a word defining the name of the animal, then a colon, and then some image parameters in brackets.

    Note

    Json files require correct formatting. Be careful not to accidentally delete commas, etc.

    8_open_organism_json.png
  24. To enter your animal parameters you have two options: The easiest (and safest) option is to choose an animal in the list that you are certain to never use and just enter your parameters:

    9_modified_organism_json.png

    Alternatively, you may also enter a new “cell” at the end of the list. There is no limit on the number of different animals that can be entered in this list.

  25. Now save the file (do not rename it). If you want to keep a backup, rename the original, e.g. to “list_of_available_organisms_original.json”.

  26. Restart the PiVR software (so that it reads the newly defined animal parameters).

  27. If you want to know whether PiVR is able to perform real-time tracking, you can open the “experiment_settings.json” file in one of the video folders you used to find the animal parameters (or a newly created video) and change the “Model Organism” cell name to your animal name.

    10_change_experiment_settings.png
  28. Now, select the “Debug->Simulate Online Tracking” window, select a video and check whether the algorithm can track the animal in real-time. If not, you might have to select more stringent animal parameters and/or you have to optimize imaging conditions.
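Instead of reading the extreme values off the line plots by eye, you can also script the extraction from the heuristics files. A minimal sketch using Python's csv module; the column names used here are assumptions, so substitute whatever headers your “DATE_TIME_heuristics.csv” actually contains:

```python
import csv

def parameter_range(csv_path, column):
    """Return (min, max) of one numeric column of a heuristics CSV.
    Rows with missing or non-numeric entries are skipped."""
    values = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                values.append(float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip missing or non-numeric entries
    return min(values), max(values)

# e.g. parameter_range("2020.01.01_12-00_heuristics.csv", "filled area")
# gives the extremes you would otherwise read off the line plot.
```

Running this for each relevant column over all your videos gives the minima and maxima you then enter into “list_of_available_organisms.json”.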

8.3. Create your own undistort files

The standard camera introduces a fisheye effect, as can be observed below: in reality, the edges of the dish are straight.

18_distorted_image.png

This can lead to problems when collecting data.

PiVR allows for the correction of these optical effects based on a function in the opencv library.

Note

This option cannot be turned on if opencv is not installed. If the menu is greyed out, make sure to install opencv. In addition, ‘noCV2’ will be displayed next to the version number of PiVR.

If you are on the Raspberry Pi the easiest way to install opencv is to wipe the SD card, reinstall the OS and make a clean install of the PiVR software using the PiVR installation file.

On a PC, just install it using conda by first (1) activating the PiVR environment and (2) then entering conda install -c conda-forge opencv

The image below demonstrates the functionality of the undistort algorithm.

19_undistorted_image.png

This option is always turned on after v1.7.0 unless you turned it off as described here.

Why is it so important to fix the distorted image? PiVR assigns the x and y position of the animal based on the image it gets. If the input image is distorted, these values will be off. For example, in the trajectory below the x/y positions of the animal differ visibly between the distorted original image and the undistorted image.

24_example_trajectory.png

When a virtual reality is presented, the arena is rendered based on the distorted image. This distorts the virtual reality itself.

For the undistort algorithm to work, it requires lens-specific distortion coefficients. PiVR comes with these coefficients for the standard lens listed in the BOM, for all standard resolutions (640x480, 1024x768 and 1296x972).

Important

If you are using a different lens you must create your own undistort files. Read on to learn how to do so.

To create your own undistort files please follow the steps below.

Note

You will need to conduct this procedure for every resolution you want to employ in your experiments.

  1. Print this chessboard on a piece of paper.

  2. Go to your PiVR setup and place the printed chessboard on a well lit area (for example the light pad).

  3. Make sure you have selected the resolution you want to use in the future (see Resolution).

  4. Open the Timelapse Recording Window in the recording menu.

  5. Select a place to save the files. You might want to keep them just in case.

  6. Set it to record for 1 minute with one image every 2 seconds.

  7. Hit start.

  8. Take the camera into your hands and take images of the chessboard from different angles. See the collage below for an example of the different angles you want to get. Try to get the whole chessboard into the field of view.

    20_grabbing_pictures.png
  9. Once you are done, go to ‘Tools’ -> ‘Undistort, new lens’

    21_undistortnewLens.png
  10. Press the ‘Chessboard Images’ Button and select the folder where you saved the chessboard images you just took.

    22_undistortnewLensOptions.png
  11. Everything will now freeze for a couple of minutes. At one point you will start to see parts of the chessboard. If the images are not good (e.g. because parts of the chessboard are missing from the field of view) you will get an error message. Please re-take pictures.

  12. Once the algorithm is done, you should have a new set of undistort coefficient files on your local setup. If you want to make a copy, they are in PiVR/PiVR/undistort_matrices/user_provided.

  13. To use these matrices in your next experiment, press the ‘Options’ Menu in the Menu Bar. Then select ‘Undistort Options’.

    UndistortOptions.png
  14. In the popup select ‘Use your own undistort files’.

    23_undisortSelectOwnFiles.png
  15. Save the settings.

  16. From now on, the output of online tracking is based on your own undistort files.