Writing a Robot2Neuron TF

This tutorial assumes you have familiarized yourself with the Braitenberg experiment. If not, please consider reading the Tutorial setup first.

As with Neuron2Robot, a Robot2Neuron TF in Python is basically a Python function with a set of decorators. These decorators turn a simple Python function into a TF by specifying where the function's parameters come from and what should happen with the function's return value. Let us begin to implement the TF from above manually in Python code.


The following code will usually be generated by the BIBI configuration generator if BIBI Configurations are used.

Eye_sensor_transmit in Python

Unsurprisingly, the declaration of a Robot2Neuron TF in Python looks very similar to that of a Neuron2Robot TF.

import hbp_nrp_cle.tf_framework as nrp

@nrp.Robot2Neuron()
def eye_sensor_transmit(t):
    pass

This will define a new Robot2Neuron TF and add it to the default TF manager instance.

Connecting to the robot simulation

As before, the connection to the robot simulation is done through a mapping decorator:

@nrp.MapRobotSubscriber("camera", Topic('/husky/camera', sensor_msgs.msg.Image))
@nrp.Robot2Neuron()
def eye_sensor_transmit(t, camera):
    pass

The decorator tells the CLE that the camera parameter originates from a robot topic with the given arguments. The camera parameter is now a robot subscriber that provides two properties: value, which is the last received image, and changed, which indicates whether the value has changed since the last simulated step.
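The behavior of these two properties can be illustrated with a minimal stand-in. Note that MockSubscriber below is a hypothetical class written for this tutorial, not part of the NRP API; the real subscriber's internals may differ.

```python
class MockSubscriber:
    """Illustrative stand-in for a robot subscriber (hypothetical, not the NRP class)."""

    def __init__(self):
        self.value = None       # last received message, None until something arrives
        self._last_seen = None  # message the CLE saw at the end of the previous step

    def receive(self, msg):
        # Would be called by the middleware when a new message arrives on the topic
        self.value = msg

    @property
    def changed(self):
        # True if a new message arrived since the last simulated step
        return self.value is not self._last_seen

    def finalize_step(self):
        # The CLE would reset the changed flag once per simulation step
        self._last_seen = self.value


camera = MockSubscriber()
camera.receive("image_1")
print(camera.changed)   # True: a new image arrived this step
camera.finalize_step()
print(camera.changed)   # False: nothing new since the step ended
```

Checking `camera.changed` before processing `camera.value` lets a TF skip redundant work when no new image has arrived.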

Connecting to the neuronal network

Since we now have three different neuron groups, we do not use the return channel but dedicated channels for the devices. That is, we use dedicated parameters and decorators as follows:

@nrp.MapRobotSubscriber("camera", Topic('/husky/camera', sensor_msgs.msg.Image))
@nrp.MapSpikeSource("red_left_eye", nrp.brain.sensors[slice(0, 3, 2)], nrp.poisson)
@nrp.MapSpikeSource("red_right_eye", nrp.brain.sensors[slice(1, 4, 2)], nrp.poisson)
@nrp.MapSpikeSource("green_blue_eye", nrp.brain.sensors[4], nrp.poisson)
@nrp.Robot2Neuron()
def eye_sensor_transmit(t, camera, red_left_eye, red_right_eye, green_blue_eye):
    pass
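The slice objects select interleaved neurons from the sensors population. A quick check in plain Python shows which indices each slice picks out:

```python
# Indices of the five neurons in the sensors population
sensors = [0, 1, 2, 3, 4]

# slice(start, stop, step): every second element from start up to, but not including, stop
red_left = sensors[slice(0, 3, 2)]   # same as sensors[0:3:2]
red_right = sensors[slice(1, 4, 2)]  # same as sensors[1:4:2]

print(red_left)   # [0, 2]
print(red_right)  # [1, 3]
```

So red_left_eye is connected to sensor neurons 0 and 2, red_right_eye to neurons 1 and 3, and green_blue_eye to neuron 4.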

This has the same effect as the XML from above, except that in the Python implementation we are not limited to library functions but are free to implement the color detection directly. The Python way is thus more flexible; in the long term, we aim to provide better tool support for this through a graphical editor.
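To illustrate what such a direct implementation might look like, the following sketch counts predominantly red pixels in the left and right halves of an image. The image format (a flat row-major list of RGB tuples), the threshold values, and the helper name detect_red are all assumptions made for this example; they are not prescribed by the tutorial or the NRP library.

```python
def detect_red(image, width):
    """Return the fraction of predominantly red pixels in the left and right image halves.

    `image` is assumed to be a flat, row-major list of (r, g, b) tuples
    with values in 0..255; `width` is the number of pixels per row.
    """
    left_red = right_red = 0
    left_total = right_total = 0
    for i, (r, g, b) in enumerate(image):
        is_red = r > 127 and g < 64 and b < 64  # assumed threshold for "red"
        if (i % width) < width // 2:            # pixel lies in the left half
            left_total += 1
            left_red += is_red
        else:                                   # pixel lies in the right half
            right_total += 1
            right_red += is_red
    return left_red / left_total, right_red / right_total


# A 4x2 toy image: left half red, right half green
red, green = (255, 0, 0), (0, 255, 0)
img = [red, red, green, green,
       red, red, green, green]
print(detect_red(img, width=4))  # (1.0, 0.0)
```

Inside the TF body, such fractions could then be scaled into firing rates and assigned to the Poisson generator devices, e.g. to red_left_eye and red_right_eye.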