Compiling new protobuf message definitions¶
This guide shows how to compile and install .proto files so they can be used in NRPCore experiments. This can be done easily with the provided Python script, nrp_compile_protobuf.py. Afterwards, the compiled Protobuf message types can be used by gRPC Engines and Transceiver Functions (TFs).
In order for the script to work, all .proto files which are to be compiled must have a package specifier (e.g. package MyPackage;), as described in the protobuf documentation. This package specifier is used to name the compiled libraries, header files and generated classes. Compiling .proto files without a package specifier leads to a configuration error.
The script takes two arguments:

- --proto_files_path: path to the .proto files which should be compiled. By default, the current working directory.
- --install_dir: installation directory. By default, the folder where NRPCore was installed.
The script is installed with NRPCore, so it can be invoked directly from the command line. E.g.:
nrp_compile_protobuf.py --help
will print the script help information.
When executed, it will compile all the .proto files found at proto_files_path and install the compiled libraries in install_dir. For a description of the compiled libraries see here.
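For example, to compile the .proto files in a hypothetical ~/my_protos folder and install the results into a custom location (both paths are placeholders for this illustration):

nrp_compile_protobuf.py --proto_files_path ~/my_protos --install_dir ~/nrp_install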
Afterwards, the new message definitions will be available for exchanging data with gRPC Engines through the Engine.DataPackMessage message type. This is the message type used by gRPC Engine servers to send data to gRPC Engine clients. It contains a DataPack Id and the data itself, stored in a data field of type Any. Any is a field type which can store a message of any type. Message types compiled with the provided Python script can be sent to gRPC Engine servers wrapped in Engine.DataPackMessage.
The Engine.DataPackMessage can be found in the folder nrp-core-msgs/protobuf/nrp_proto_defs.
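To illustrate what a field of type Any means, below is a minimal sketch using the standard protobuf Python API. NRPCore performs this wrapping internally; the module mypackage_pb2 and its MyMessage type are placeholders borrowed from the complete example at the end of this page.

from google.protobuf import any_pb2
import mypackage_pb2  # hypothetical module compiled from a MyPackage .proto file

msg = mypackage_pb2.MyMessage()
msg.integer = 42

# An Any field can hold a message of any type, together with its type URL
any_field = any_pb2.Any()
any_field.Pack(msg)

# The receiving side can check the stored type and unpack it
assert any_field.Is(mypackage_pb2.MyMessage.DESCRIPTOR)
unpacked = mypackage_pb2.MyMessage()
any_field.Unpack(unpacked)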
Using compiled messages in GRPC Engines¶
All GRPC Engines use protobuf messages to exchange data between client and server. GRPC Engines have a configuration parameter, "ProtobufPackages", an array containing the list of Protobuf packages that can be used by the Engine. Each element of this array can be the package specifier of any of the .proto files compiled with nrp-core or with the nrp_compile_protobuf.py script.
Additionally, the Python GRPC Engine can import and use the generated Python protobuf modules. As described here, these are named <package>_pb2.py, where <package> is the package specifier of the .proto file in lowercase.
Modules from .proto files compiled with nrp-core can be found under the nrp_protobuf package. E.g.:
from nrp_protobuf import dump_pb2
Modules compiled using nrp_compile_protobuf.py can be imported directly, as in the example below.
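For instance, for a package named MyPackage compiled with the script (as in the complete example at the end of this page), the import would simply be:

import mypackage_pb2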
Using compiled messages in TFs¶
Python bindings are generated for compiled Protobuf messages, which allow using them in TFs. See here for more details.
For example, let's say that nrp_compile_protobuf.py is used to compile a .proto file with package name MyPackage containing one message definition, MyMessage. Then a Python module named mypackage is generated, containing two classes: MyPackageMyMessage, which wraps a MyPackage::MyMessage C++ object; and MyPackageMyMessageDataPack, which wraps a DataPack<MyPackage::MyMessage> C++ object. This is all that is required to use the new Protobuf message definitions in TFs.
from mypackage import MyPackageMyMessageDataPack

d = MyPackageMyMessageDataPack('model', 'engine')
type(d.data)  # returns <class 'mypackage.MyPackageMyMessage'>
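The fields of the wrapped message can then be accessed through the datapack's data attribute, as the TF in the complete example below does for reads. A minimal sketch, assuming MyMessage defines a string field str (as in the complete example) and that the generated wrappers expose message fields as writable attributes:

# 'str' is a field of the example MyMessage definition below; writability
# of wrapper fields is assumed here for illustration
d.data.str = "some value"
print(d.data.str)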
Again, for more details about the Python wrappers generated for Protobuf messages, see here.
Using compiled messages with the Python GRPC Engine¶
When subclassing GrpcEngineScript in a Python GRPC Engine, the Python modules directly generated by protoc are used (instead of the Python wrappers described above). They can be used to register, set and get datapacks, as in the example below, taken from the examples/tf_exchange experiment.
"""Python Engine 1. Will get current engine time and make it accessible as a datapack""" from nrp_core.engines.python_grpc import GrpcEngineScript from nrp_protobuf import dump_pb2 class Script(GrpcEngineScript): def initialize(self): """Initialize datapack1 with time""" print("Engine 1 is initializing. Registering datapack...") self._registerDataPack("datapack1", dump_pb2.String) d = dump_pb2.String() d.string_stream = str(self._time_ns) self._setDataPack("datapack1", d) def runLoop(self, timestep_ns): """Update datapack1 at every timestep""" self._getDataPack("datapack1").string_stream = str(self._time_ns) print("DataPack 1 data is ", self._getDataPack("datapack1").string_stream) def shutdown(self): print("Engine 1 is shutting down") def reset(self): print("Engine 1 is resetting")
Nested Package Specifiers¶
As commented above, the package specifier in .proto files is used to name the compiled libraries and header files. Thus, using the same package specifier in more than one .proto file will lead to one of them overwriting the other. If you wish to have Protobuf message definitions from multiple .proto files under the same namespace, you can use nested package specifiers instead, as in the code snippet below:
package SuperPackage.MyPackage;

message MyMessage
{
    uint32 somefield = 1;
}
Compiling this code would generate the following components:
- a superpackage_mypackage.pb.h header file generated by protoc
- a libProtoSuperPackageMyPackage.so library, containing the C++ code generated by protoc
- a libNRPProtoSuperPackageMyPackageOps.so library, linking to the former one and containing the Protobuf conversion functions needed by GRPC Engine clients and servers
- a <superpackage_mypackage>.so library, also linking to libProtoSuperPackageMyPackage.so and containing the Protobuf Python bindings which allow using the compiled messages in Transceiver Functions
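Following the naming conventions described earlier, the Python bindings for this nested package would then presumably be imported as below. The module and class names here are inferred from the naming pattern, not taken verbatim from the documentation:

from superpackage_mypackage import SuperPackageMyPackageMyMessageDataPack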
Complete example¶
Now let's see a full example. We'll write and compile a new Protobuf message definition, which will be used in a Python GRPC Engine to relay data through a TF to a DataTransfer Engine, which will log it into a file.
First, let's put the code below in a .proto file. If the extension is not .proto, it won't be found by the compilation script.
syntax = "proto3"; package MyPackage; /* * Message used for testing */ message MyMessage { uint32 integer = 1; string str = 2; }
Now let's compile the file by executing the script from the folder where the file is located:
nrp_compile_protobuf.py
After the script execution ends, the package is ready to be used with NRPCore. Let's test it by implementing an experiment in which a TF reads datapacks of type MyMessage from a Python GRPC Engine and sends them to a DataTransfer Engine, which logs them into a file.
We’ll first add the experiment configuration file. Paste the code below into a file, e.g. simulation_config.json
:
{ "SimulationTimeout": 5, "EngineConfigs": [ { "EngineType": "python_grpc", "EngineName": "python_1", "ServerAddress":"localhost:1234", "PythonFileName": "engine_grpc.py", "ProtobufPackages": ["MyPackage"] }, { "EngineType": "datatransfer_grpc_engine", "EngineName": "datatransfer_engine", "ServerAddress": "localhost:9006", "dataDirectory": "data/test", "ProtobufPackages": ["MyPackage"], "dumps":[ {"name": "datapack_1", "network": false, "file": true} ] } ], "DataPackProcessingFunctions": [ { "Name": "tf", "FileName": "tf.py" } ] }
The line:
"ProtobufPackages": ["MyPackage"]
allows the Engines to exchange Protobuf messages from MyPackage.
The line:
{"name": "datapack_1", "network": false, "file": true}
declares a DataPack to be logged, which will be sent by a TF and will be of type MyMessage.
Next, let's add "engine_grpc.py", containing the GrpcEngineScript for the Python GRPC Engine:
from nrp_core.engines.python_grpc import GrpcEngineScript
import mypackage_pb2

class Script(GrpcEngineScript):
    def initialize(self):
        # Register a datapack of the newly compiled type and initialize it
        self._registerDataPack("datapack_1", mypackage_pb2.MyMessage)
        d = mypackage_pb2.MyMessage()
        d.integer = self._time_ns
        d.str = "from grpc python engine"
        self._setDataPack("datapack_1", d)

    def runLoop(self, timestep_ns):
        # Update the integer field with the current engine time (in ms)
        self._getDataPack("datapack_1").integer = int(self._time_ns / 1e6)

    def shutdown(self):
        pass

    def reset(self):
        pass
It registers a datapack "datapack_1" of type mypackage_pb2.MyMessage and updates its integer field at every time step with the current Engine simulation time. This datapack is fetched every time step by the TF defined below and relayed to the DataTransfer Engine to be logged.
Now let's write the TF. Paste this code into a tf.py file:
from nrp_core import *
from mypackage import MyPackageMyMessageDataPack

@EngineDataPack(keyword='datapack_1', id=DataPackIdentifier('datapack_1', 'python_1'))
@TransceiverFunction("datatransfer_engine")
def data_transfer(datapack_1):
    print(f"Relaying datapack with data: '{datapack_1.data.str}' '{datapack_1.data.integer}' to datatransfer_engine")
    datapack_1.engine_name = "datatransfer_engine"
    return [datapack_1]
Finally, let's execute the experiment:
NRPCoreSim -c simulation_config.json
After the experiment completes its execution, there should be a new file data/test/<time_stamp>/datapack_1-0.data with many rows containing the data logged from datapack_1.