PiVR has been developed by David Tadres and Matthieu Louis (Louis Lab).

5. PiVR Software Manual

Warning

If you have the High LED power version of PiVR you must take care to properly shield yourself and others from the potentially very strong LED light to protect eyes and skin!

Important

Several options will open a pop-up. You must close the pop-up in order to interact with the main window.

Important

The software behaves differently when run on a Raspberry Pi compared to any other PC. This manual describes the Raspberry Pi version of the software.

5.1. The Menubar

To select a different window, use the Menu Bar at the top of the window.

Menubar.png

5.2. The Recording Menu

The Recording Menu lets you choose between different recording options. There are currently 4 different methods:

  1. Tracking – Online tracking of a single animal. Possibility of delivering a time dependent stimulus.

  2. VR Arena – Online tracking of a single animal. Presents a virtual arena that defines how the stimulus is presented in response to the position of the animal.

  3. Full Frame Recording – Record an image sequence. Possibility of delivering a time dependent stimulus.

  4. Video – Record a video (h264 format). Possibility of delivering a time dependent stimulus.

RecordingMenu.png

5.2.1. Camera Control Frame

In all of the recording options you have access to the Camera control frame. It can be used to turn the camera preview on (Cam On) and off (Cam Off). You can also control the size of the preview window.

Warning

The Camera preview is always on top of everything else on the screen. Use the Preview Window carefully!

CameraControlFrame.png

5.2.2. Experiment Control Frame – Tracking

The ‘Recording’ Option you choose is printed in bold at the top of the Experiment Control Frame. In this example it is ‘Online Tracking’.

ExperimentControlFrameTracking.png

Online tracking tracks a single animal.

You have to select a folder in which the experiment will be saved by clicking on the button to the right of ‘Save in:’.

You can then give your experiment an identifier. Examples include genotypes or an experimental treatment. This information will be saved in your experiment folder.

If you want to present a Time Dependent Stimulus, press the button ‘Select Time Dependent Stim File’. Please follow the guidelines in ‘Preparing a Time Dependent Stimulus File’ below to learn how to prepare the file.

The figure below gives you a quick overview of the parameters used by the program:

  1. Pixel/mm: Essential: This value has to be set by you before you run your first experiment! See set Pixel/mm. You must change it after changing the resolution or adjusting the height of the camera relative to the arena!

  2. Framerate: The framerate at which you will be tracking the animal. See adjust image to see how to adjust the framerate.

    Warning

    There is a difference between the framerate the camera can deliver and the framerate the Raspberry Pi can handle. If you select a very high framerate, you might get a lower framerate than expected. Always check the timestamps in the ‘data.csv’ if you are trying a new, higher framerate than before! (One way to check this is sketched at the end of this section.)

  3. VR stim at: N/A

  4. Animal Detection Mode: Either Mode 1, Mode 2 or Mode 3. See Select Animal Detection Mode.

  5. Cam Resolution: Indicates the resolution you selected. See adjust image to see how to change the resolution.

    Important

    For Online Tracking you can only use 640x480.

  6. Animal: Essential for Online Tracking. See here for how to select an animal. See Define new animal in case you are working with an animal that is not listed. If you are having problems detecting your animal, see here.

Next, please enter the time you want to track the animal in the field below ‘Recording Time[s]’. Then hit ‘Start Tracking’.
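
As recommended in the framerate warning above, it is worth verifying the framerate that was actually achieved by looking at the timestamps in ‘data.csv’. Below is a minimal sketch of one way to do this in Python. The file path and the name of the timestamp column are placeholders: check the header of your own ‘data.csv’ and adjust them.

import pandas as pd

# Load the tracking data written into the experiment folder (path is a placeholder).
data = pd.read_csv('/path/to/experiment_folder/data.csv')

# Placeholder column name - replace it with the timestamp column of your data.csv.
timestamps = data['Time [s]']

# Time elapsed between consecutive frames, in seconds.
frame_intervals = timestamps.diff().dropna()

print('median achieved framerate: %.1f fps' % (1.0 / frame_intervals.median()))
print('slowest frame interval: %.3f s' % frame_intervals.max())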

5.2.3. Experiment Control Frame – VR Arena

The ‘Recording’ Option you choose is printed in bold at the top of the Experiment Control Frame. In this example it is ‘Closed Loop Stimulation’.

ExperimentControlFrameVRArena.png

Closed Loop Stimulation tracks a single animal.

You have to select a folder in which the experiment will be saved by clicking on the button to the right of ‘Save in:’.

You can then give your experiment an identifier. Examples include genotypes or an experimental treatment. This information will be saved in your experiment folder.

To present a virtual arena (stimulation depending on the position of the animal) press the ‘Select VR Arena’ Button and select an arena. Static virtual arenas are csv files. Dynamic virtual arenas are currently npy files. To learn how to create a new arena please see Create new VR Arena.
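
As an illustration, the sketch below generates a simple static arena file in Python. It rests on assumptions that you should verify against Create new VR Arena before using such a file in an experiment: that a static arena is a csv grid matching the 640x480 resolution used for closed loop experiments, and that each value is a stimulus intensity between 0 (LED off) and 100 (LED fully on).

import numpy as np

# Assumed arena size: matches the 640x480 resolution used for closed loop experiments.
height, width = 480, 640

# Horizontal gradient: 0% stimulus intensity at the left edge, 100% at the right edge.
arena = np.tile(np.linspace(0, 100, width), (height, 1))

# Save as csv so it can be selected with the 'Select VR Arena' button.
np.savetxt('gradient_arena.csv', arena, fmt='%.1f', delimiter=',')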

The box below gives you a quick overview of the parameters used by the program:

  1. Pixel/mm: Essential: This value has to be set by you before you run your first experiment! See set Pixel/mm. You must change it after changing the resolution or adjusting the height of the camera relative to the arena!

  2. Framerate: The framerate at which you will be tracking the animal. See adjust image to see how to adjust the framerate.

    Warning

    There is a difference between the framerate the camera can deliver and the framerate the Raspberry Pi can handle. If you select a very high framerate you might get a lower framerate than expected. Always check the timestamps in the ‘data.csv’ if you are trying a new, higher framerate than before!

  3. VR stim at: Either Head, Centroid, Midpoint or Tail. See here for how to select it.

  4. Animal Detection Mode: Either Mode 1, Mode 2 or Mode 3. See Select Animal Detection Mode.

  5. Cam Resolution: Indicates the resolution you selected. See adjust image to see how to change the resolution.

    Important

    For Closed Loop Experiments you can only use 640x480.

  6. Animal: Essential for Closed Loop Experiments. See here for how to select an animal. See Define new animal in case you are working with an animal that is not listed. If you are having problems detecting your animal, see here.

Next, please enter the time you want to track the animal in the field below ‘Recording Time[s]’. Then hit ‘Start Tracking VR’.

5.2.4. Experiment Control Frame – Full Frame Recording

The ‘Recording’ Option you choose is printed in bold at the top of the Experiment Control Frame. In this example it is ‘Image Sequence’.

ExperimentControlFrameImageSequence.png

Image Sequence just records still images without tracking anything. The advantage over video is that the image data is not compressed. The disadvantage is that it is limited by the time it takes the Raspberry Pi to write each file to the SD card. If you are using a higher quality SD card, you will be able to write at a higher framerate. However, it will probably always be lower than with video.

You have to select a folder in which the experiment will be saved by clicking on the button to the right of ‘Save in:’.

You can then give your experiment an identifier. Examples include genotypes or an experimental treatment. This information will be saved in your experiment folder.

If you want to present a Time Dependent Stimulus, press the button ‘Select Time Dependent Stim File’. Please follow the guidelines in ‘Preparing a Time Dependent Stimulus File’ below to learn how to prepare the file.

The box below gives you a quick overview of the parameters used by the program:

  1. Pixel/mm: This value indicates how many pixels correspond to one mm. You will need this value to be correct to calculate anything involving distance afterwards (speed, distance to source, etc.). See set Pixel/mm. You must change it after changing the resolution or adjusting the height of the camera relative to the arena!

  2. Framerate: The framerate at which images will be recorded. See adjust image to see how to adjust the framerate.

    Warning

    There is a difference between the framerate the camera can deliver and the framerate the Raspberry Pi can handle. If you select a very high framerate you might get a lower framerate than expected. Always check the timestamps in the ‘data.csv’ if you are trying a new, higher framerate than before!

  3. VR stim at: N/A

  4. Animal Detection Mode: N/A.

  5. Cam Resolution: Indicates the resolution you selected. See adjust image to see how to change the resolution.

  6. Animal: Value that will be saved in ‘experiment_settings.json’.

Select the image format you want your images to be in: jpg, png, rgb, yuv or rgba. See here for details on the different formats.

Next, please enter the time you want to record in the field below ‘Recording Time[s]’.

Then hit ‘Start Recording Images’.
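
If you choose one of the raw formats (rgb, yuv or rgba), each file contains unencoded pixel values rather than a standard image file. As a rough sketch, a 640x480 rgb frame can be loaded into a numpy array as shown below. The filename is a placeholder, and note that raw captures at some other resolutions are typically padded by the camera so that the width is a multiple of 32 and the height a multiple of 16, in which case the reshape needs the padded dimensions.

import numpy as np

width, height = 640, 480

# Placeholder filename - use an actual .rgb file from your experiment folder.
raw = np.fromfile('/path/to/experiment_folder/frame.rgb', dtype=np.uint8)

# One byte per colour channel: rows x columns x (R, G, B).
frame = raw.reshape((height, width, 3))
print(frame.shape, frame.dtype)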

5.2.5. Experiment Control Frame – Video

The ‘Recording’ Option you choose is printed in bold at the top of the Experiment Control Frame. In this example it is ‘Video’.

ExperimentControlVideo.png

As the name indicates, use this option to record videos. The advantage of this method over image sequence is its superior speed. The disadvantage, especially for scientific questions, might be that it compresses the image data in the temporal domain. See here for an introduction and the Wikipedia page for more details.

You have to select a folder in which the experiment will be saved by clicking on the button to the right of ‘Save in:’.

You can then give your experiment an identifier. Examples include genotypes or an experimental treatment. This information will be saved in your experiment folder.

If you want to present a Time Dependent Stimulus, press the button ‘Select Time Dependent Stim File’. Please follow the guidelines in ‘Preparing a Time Dependent Stimulus File’ below to learn how to prepare the file.

The box below gives you a quick overview of the parameters used by the program:

  1. Pixel/mm: This value indicates how many pixels correspond to one mm. You will need this value to be correct to calculate anything involving distance afterwards (speed, distance to source, etc.). See set Pixel/mm. You must change it after changing the resolution or adjusting the height of the camera relative to the arena!

  2. Framerate: The framerate at which the video will be recorded. See adjust image to see how to adjust the framerate.

    Warning

    There is a difference between the framerate the camera can deliver and the framerate the Raspberry Pi can handle. If you select a very high framerate you might get a lower framerate than expected. Always check the timestamps in the ‘data.csv’ if you are trying a new, higher framerate than before!

  3. VR stim at: N/A

  4. Animal Detection Mode: N/A.

  5. Cam Resolution: Indicates the resolution you selected. See adjust image to see how to change the resolution.

    Important

    For video you cannot use 2592x1944.

  6. Animal: Value that will be saved in ‘experiment_settings.json’.

Next, please enter the time you want to record in the field below ‘Recording Time[s]’. Then hit ‘Start Recording Images’.
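
For offline analysis you may want to read the recorded h264 stream back frame by frame. Below is a minimal sketch using OpenCV; it assumes that opencv-python with ffmpeg support is installed on your analysis machine, and the file path is a placeholder.

import cv2

# Placeholder path - point this at the h264 file in your experiment folder.
cap = cv2.VideoCapture('/path/to/experiment_folder/video.h264')

n_frames = 0
while True:
    ok, frame = cap.read()   # frame is a height x width x 3 BGR array
    if not ok:
        break
    n_frames += 1

cap.release()
print('decoded %d frames' % n_frames)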

5.3. Preparing a Time Dependent Stimulus File

In your PiVR folder you can find a folder called ‘time_dependent_stim’. On a fresh install it is supposed to contain a single file: blueprint_stim_file.csv

When you open it with, e.g., Excel or your csv viewer of choice, you’ll see that there are 5 columns and many rows:

TimeDepStimFile.png

The first column (A) is the frame number. E.g., if you are recording at 30 frames per second, rows 2-32 define what is going on during the first second of the recording.

The second column defines what Channel 1 is doing at a given frame. 0 means the light is completely OFF. 100 means the light is completely ON. A number in between, e.g. 50, means that the light is on at 50/100 = 50% intensity.

The third (Channel 2), the fourth (Channel 3) and the fifth (Channel 4) use the same principle for the other channels.

It is important to note that the stimulation file is defined at a very low level: the frame number. The same stimulus file will produce different stimulations depending on the framerate. Therefore:

  1. Decide on a framerate for your experiment; as an example, we’ll say you decide on 30 fps

  2. Decide on a length of your experiment, for example 20 seconds

  3. Decide on the stimulation pattern, e.g. you want Channel 1 to be OFF for the first second and Channel 2 to be ON for the first second. Then you want to switch: Channel 1 is ON for 1 sec, Channel 2 is OFF for 1 sec

  4. You will need to set the first 30 (framerate * length of stimulus) rows of Channel 1 to 0

  5. And you will need to set the first 30 (framerate * length of stimulus) rows of Channel 2 to 100

  6. As you don’t care about Channels 3 and 4, you can leave them at zero

  7. At row #32 (since the data starts at row #2 in Excel), i.e. frame #30 (first column), you set Channel 1 to 100 for 30 rows (framerate * length of stimulus) to turn it ON and Channel 2 to 0 to turn it OFF (see the example script below)
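
Below is a minimal sketch of a Python script that generates such a stimulus file for the 20 second, 30 fps example above, extending the 1 second ON/OFF alternation of Channels 1 and 2 over the whole experiment. The header names are placeholders: copy the exact header row from blueprint_stim_file.csv before using such a file in an experiment.

import csv

framerate = 30        # frames per second you plan to record at
duration_s = 20       # planned length of the experiment in seconds
n_frames = framerate * duration_s

with open('my_stim_file.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    # Placeholder header - copy the exact header row from blueprint_stim_file.csv.
    writer.writerow(['frame', 'Channel 1', 'Channel 2', 'Channel 3', 'Channel 4'])
    for frame in range(n_frames):
        second = frame // framerate
        if second % 2 == 0:
            ch1, ch2 = 0, 100    # even seconds: Channel 1 OFF, Channel 2 ON
        else:
            ch1, ch2 = 100, 0    # odd seconds: Channel 1 ON, Channel 2 OFF
        writer.writerow([frame, ch1, ch2, 0, 0])  # Channels 3 and 4 stay at 0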

Notes:
  1. If you do not define enough rows for your experiment, the last defined value of each channel will be propagated. E.g., if you want to run the 20 second experiment at 30 frames per second but only define what happens during the first 15 seconds (by only going to row 15*30=450 instead of row 20*30=600), the last value will be used for the rest of the experiment: if row 450 is set to 100 and rows 451 to 600 are not defined, the value 100 will be used until the end.

  2. If you define more rows than you need for your experiment, only the stimulation values up to the end of the recording are used (this behaves as you would probably expect)

5.4. Set Pixel/mm

In order to set Pixel/mm for your resolution, press the ‘Options’ Menu in the Menu Bar. Then select ‘Define Pixel/mm’.

OptionsDefinePxMm.png

In the popup window you will see the following features:

  1. The resolution you are currently using. The defined value will only be valid for this resolution.

  2. The left and right cutoff sliders. By moving them you can measure a distance.

  3. A slice of the image taken by the camera. You should place an object of known length horizontally in front of the camera.

  4. A text field to enter a length you want to measure.

DistanceConfigurationOverview.png

Below is an example of an adjusted distance configuration window. Once you are satisfied with the adjustments you’ve made, hit the quit button.

DistanceConfigurationAdjusted.png
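
The resulting Pixel/mm value corresponds to the pixel distance between the two cutoff sliders divided by the physical length you entered. As a quick sanity check with purely hypothetical numbers:

# Hypothetical example values, for illustration only.
pixel_distance = 240      # pixels between the left and right cutoff sliders
object_length_mm = 50     # length entered in the text field, in mm

print(pixel_distance / object_length_mm, 'pixel/mm')   # -> 4.8 pixel/mm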

5.5. Adjust image

In order to set any options related to the image, press the ‘Options’ Menu in the Menu Bar. Then select ‘Optimize Image’.

OptionsOptimizeImage.png

This popup should be used to set up the image in the optimal way:

  1. Turn the camera on (‘Cam On’) if it’s not on already

  2. Adjust the preview size so that you can comfortably see both the preview and the popup.

  3. Set the framerate as desired.

  4. Press the ‘Update Preview Framerate’ button

  5. Set the resolution you’d like to use for the recording.

    Important

    For Online Tracking and Closed Loop Experiments only 640x480 is possible. For video you cannot use 2592x1944.

  6. Make sure the autoexposure button says ‘autoexp on’.

  7. Turn the Backlight Intensity up. It is normal to only see something above 150’000. 400’000-500’000 is often a good value to choose.

  8. If you have Backlight 2 intensity on one of the GPIOs (see define GPIO output channels) you can also adjust Backlight 2 intensity at this point.

  9. To test your output channels, slide the appropriate slider to the right. At the beginning of any experiment these will be turned off again. To keep a stimulus ON for the duration of the experiment, use the Backlight 2 intensity.

OptimizeImageOverview.png

5.5.1. Set up optimal image

In order to set up optimal image parameters I usually do the following:

  1. Turn ‘Cam On’

  2. Set ‘autoexp on’

  3. Pull ‘Backlight Intensity’ slider all the way to the left (Image will be dark)

  4. Now pull the ‘Backlight Intensity’ slider to the right. As soon as I see an image in the camera I go another 100‘000 to the right - this way I’m not at the lower detection limit of the camera.

  5. Then I turn ‘autoexp off’

  6. Often it can improve the image if I pull the ‘Backlight Intensity’ slider a bit more to the right, effectively overexposing the image a bit.

5.6. Define GPIO output channels

In order to define GPIO output channels, press the ‘Options’ Menu in the Menu Bar. Then select ‘define GPIO output channels’.

OptionsDefineOutputChannels.png
outputChannelSelection.png

The images on the far left indicate which of the outputs on the left of your setup correspond to which GPIO (e.g. the one closest to the LED power input is GPIO#18).

There are 4 GPIOs that can be used to control LEDs: GPIO#18, GPIO#17, GPIO#27 and GPIO#13. GPIO#18 and GPIO#13 are special as they are the only ones capable of using PWM frequencies above the kilohertz range.

Channel 1 is always defined as the channel that is used for the Virtual Arena experiments.

Channel 1, Channel 2, Channel 3 and Channel 4 can be separately addressed using the time dependent stimulus files.
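
If you want to bench-test an LED connected to one of these outputs independently of PiVR, the sketch below uses the pigpio library to drive GPIO#18 with hardware PWM. This is only an illustrative test, not part of the PiVR software: it assumes the pigpio daemon is running (started with ‘sudo pigpiod’), and you should not run it while PiVR itself is controlling the output channels.

import time
import pigpio

pi = pigpio.pi()
if not pi.connected:
    raise RuntimeError('pigpio daemon is not running - start it with "sudo pigpiod"')

# GPIO#18 supports hardware PWM: here 1 kHz at 25% duty cycle
# (the duty cycle argument ranges from 0 to 1,000,000).
pi.hardware_PWM(18, 1000, 250000)
time.sleep(2)

# Turn the output off again and release the connection.
pi.hardware_PWM(18, 0, 0)
pi.stop()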

5.7. Turn Debug Mode ON/OFF

In order to turn Debug Mode ON or OFF, press the ‘Options’ Menu in the Menu Bar. Then go to ‘Turn Debug Mode…’ and select either ‘OFF’ or ‘ON’.

OptionsDebugMode.png

5.8. Select Animal Detection Mode

In order to define the Animal Detection Method, press the ‘Options’ Menu in the Menu Bar. Then press ‘Animal Detection Method’.

OptionsAnimalDetectionMethods.png

When running either ‘Online Tracking’ or ‘Closed Loop Stimulation’, the animal needs to be detected. There are 3 modes that can be used to detect the animal. For most cases Mode 1 (Standard) will be fine. If you need a clear background image, consider Mode 2 or Mode 3.

SelectAnimalDetection.png

5.9. Select Organism

In order to select an organism, press the ‘Options’ Menu in the Menu Bar. Then go to ‘Select Animal’ and select your animal.

OptionsSelectAnimal.png

5.10. Updating the software

In order to update the software on your Raspberry Pi, press the ‘File’ Menu in the Menu Bar. Then go to ‘Update Software’.

Note

Please make sure you are connected to the Internet when updating.

FileUpdate.png

Technicalities:

This will first update the Linux system by calling:

sudo apt-get update

Next, it will download the newest version from the gitlab repository by calling:

git pull

5.11. High/Low Power LED switch

In order to choose between the High and Low Power LED setups, press the ‘Options’ Menu in the Menu Bar. Then go to ‘High Power LEDs’.

OptionsHigLowPowerLEDSwitch.png

Select either Standard or High power version depending on the setup you have.

5.12. Select Body Part for VR stimulation

When running virtual reality experiments, the cells you are interested in could be located in different parts of the animal.

PiVR allows you to present the virtual reality depending on different body parts identified during tracking.

OptionsVRStimulationPoint.png

You may choose different body parts that are defined during tracking.

Note

As the difference between centroid and midpoint is not straightforward, please see here for an explanation.

  1. The Head (standard) will make sense in many experiments, as many sensory neurons are located there in many animals. However, be aware that the Head/Tail classification algorithm is not perfect and does make mistakes. There is no option to correct a wrong head/tail assignment during the experiment!

  2. The Centroid is probably the most consistently correct point during tracking. Please see here to see how it is defined.

  3. The Midpoint is similar to the centroid, but can be different in flexible animals such as fruit fly larvae.

  4. The Tail is the final option to choose from. We have used the presentation of the virtual reality based on tail position as a control in the past.

VRStimulationPoint_Menu.png

5.13. Animal Color Selection

Depending on your experimental setup, the animal can either be dark on white background due to transillumination, or white on dark background due to side illumination.

The standard setting is dark on white. If you need to change this setting, open the ‘Options’ Menu in the Menu Bar:

OptionMenuAnimalColor.png

Now just press the button above the image that matches your experimental setup.

AnimalColorOptions.png