.. index:: pair: page; Compiling new protobuf message definitions
.. _doxid-tutorial_add_proto_definition:

Compiling new protobuf message definitions
==========================================

This guide shows how to compile and install ``.proto`` files so they can be used in NRPCore experiments. This can easily be done with the provided Python script, ``nrp_compile_protobuf.py``. Afterwards, the compiled Protobuf message types can be used by gRPC Engines and TFs.

For the script to work, every ``.proto`` file to be compiled must have a package specifier, as described in the `protobuf documentation `__. This package specifier is used to name the compiled libraries, header files and generated classes. Compiling ``.proto`` files without a package specifier leads to a configuration error.

The script takes two arguments:

* ``--proto_files_path`` : path to the ``.proto`` files which should be compiled. The current working directory by default
* ``--install_dir`` : installation directory. By default this is the folder where NRPCore was installed

The script is installed with NRPCore, so it can be invoked directly from the command line. E.g.:

.. ref-code-block:: cpp

    nrp_compile_protobuf.py --help

will print the script help information. When executed, it compiles all the ``.proto`` files found at ``proto_files_path`` and installs the compiled libraries in ``install_dir``. For a description of the compiled libraries see :ref:`here `.

Afterwards, the new message definitions will be available for exchanging data with gRPC Engines through the *Engine.DataPackMessage* message type. This is the message type used by gRPC Engine servers to send data to gRPC Engine clients. It contains a :ref:`DataPack ` Id and the data itself, stored in a *data* field of type *Any*. *Any* is a field type which can store a message of any type.
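As an aside, the *Any* wrapping described above can be reproduced with the standard Python protobuf runtime alone. The sketch below uses the stock ``StringValue`` message (not an NRPCore type) to show how an arbitrary message is packed into, and unpacked from, an ``Any`` field:

```python
from google.protobuf import any_pb2, wrappers_pb2

# Pack an arbitrary message into an Any field, analogous to how
# Engine.DataPackMessage carries its payload
payload = wrappers_pb2.StringValue(value="hello")
container = any_pb2.Any()
container.Pack(payload)  # stores a type URL plus the serialized bytes

# The receiver checks the stored type before unpacking
assert container.Is(wrappers_pb2.StringValue.DESCRIPTOR)
unpacked = wrappers_pb2.StringValue()
container.Unpack(unpacked)
print(unpacked.value)  # prints "hello"
```

The type URL recorded by ``Pack`` is what lets a gRPC Engine client recover the concrete message type on the other side of the connection.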
Those message types compiled with the provided Python script can be sent to gRPC Engine servers wrapped in *Engine.DataPackMessage*. The *Engine.DataPackMessage* definition can be found in the folder *nrp-core-msgs/protobuf/nrp_proto_defs*.

.. _doxid-tutorial_add_proto_definition_1tutorial_using_proto_grpc:

Using compiled messages in GRPC Engines
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

All GRPC Engines use protobuf messages to exchange data between client and server. GRPC Engines have a configuration parameter, "ProtobufPackages", of type array, which contains the list of Protobuf packages that can be used by the Engine. Each element of this array can be the package specifier of any of the ``.proto`` files compiled by nrp-core or with the *nrp_compile_protobuf.py* script.

Additionally, the :ref:`Python GRPC Engine ` can import and use generated Python protobuf modules. As described :ref:`here `, these are named ``<package>_pb2.py``, ``<package>`` being the package specifier of the ``.proto`` file in lowercase. Modules from ``.proto`` files compiled with nrp-core can be found under the *nrp_protobuf* package. E.g.:

.. ref-code-block:: cpp

    from nrp_protobuf import dump_pb2

Modules compiled using *nrp_compile_protobuf.py* can be imported directly, as in the example below.

.. _doxid-tutorial_add_proto_definition_1tutorial_using_proto_tf:

Using compiled messages in TFs
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Python bindings are generated for compiled Protobuf messages, which allow them to be used in TFs. See :ref:`here ` for more details.

For example, let's say that ``nrp_compile_protobuf.py`` is used to compile a ``.proto`` file with package name ``MyPackage`` containing one message definition, ``MyMessage``. Then, a Python module named ``mypackage`` is generated, containing two classes: ``MyPackageMyMessage``, which wraps a ``MyPackage::MyMessage`` C++ object; and ``MyPackageMyMessageDataPack``, wrapping a ``:ref:`DataPack ``` C++ object.
This is all that is required to use the new Protobuf message definitions in TFs:

.. ref-code-block:: cpp

    from mypackage import MyPackageMyMessageDataPack
    d = MyPackageMyMessageDataPack('model', 'engine')
    :ref:`type `(d.data) # returns

Again, for more details about the Python wrappers generated for Protobuf messages, refer to :ref:`here `.

.. _doxid-tutorial_add_proto_definition_1tutorial_using_proto_python:

Using compiled messages with the Python GRPC Engine
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

When subclassing :ref:`GrpcEngineScript ` in a Python GRPC Engine, the Python modules directly generated by protoc are used (instead of the Python wrappers described above). They can be used to register, set and get datapacks, as in the example below, taken from the ``examples/tf_exchange`` experiment.

.. ref-code-block:: cpp

    """Python Engine 1. Will get current engine time and make it accessible as a datapack"""
    from nrp_core.engines.python_grpc import GrpcEngineScript
    from nrp_protobuf import dump_pb2

    class Script(GrpcEngineScript):
        def :ref:`initialize `(self):
            """Initialize datapack1 with time"""
            print("Engine 1 is initializing. Registering datapack...")
            self._registerDataPack("datapack1", dump_pb2.String)
            d = dump_pb2.String()
            d.string_stream = :ref:`str `(self._time_ns)
            self._setDataPack("datapack1", d)

        def runLoop(self, timestep_ns):
            """Update datapack1 at every timestep"""
            self._getDataPack("datapack1").string_stream = :ref:`str `(self._time_ns)
            print("DataPack 1 data is ", self._getDataPack("datapack1").string_stream)

        def :ref:`shutdown `(self):
            print("Engine 1 is shutting down")

        def :ref:`reset `(self):
            print("Engine 1 is resetting")

.. _doxid-tutorial_add_proto_definition_1tutorial_add_proto_definition_nested_package_name:

Nested Package Specifiers
~~~~~~~~~~~~~~~~~~~~~~~~~

As commented above, the package specifier in ``.proto`` files is used to name the compiled libraries and header files.
Thus, using the same package specifier in more than one ``.proto`` file will lead to one of them overwriting the other. If you wish to have Protobuf message definitions from multiple ``.proto`` files under the same namespace, you can use nested package specifiers instead, as in the code snippet below:

.. ref-code-block:: cpp

    package SuperPackage.MyPackage;

    message MyMessage
    {
        uint32 somefield = 1;
    }

Compiling this code would generate the following components:

* a ``superpackage_mypackage.pb.h`` header file generated by ``protoc``
* a ``libProtoSuperPackageMyPackage.so`` library, containing the cpp code generated by ``protoc``
* a ``libNRPProtoSuperPackageMyPackageOps.so`` library, linking to the former one and containing the Protobuf conversion functions needed by GRPC Engine clients and servers
* a ``.so`` library, also linking to ``libProtoSuperPackageMyPackage.so`` and containing the Protobuf Python bindings for using the compiled msgs in Transceiver Functions

.. _doxid-tutorial_add_proto_definition_1tutorial_add_proto_definition_example:

Complete example
~~~~~~~~~~~~~~~~

Now let's see a full example. We'll write and compile a new Protobuf message definition which will be used in a :ref:`Python GRPC Engine ` to relay data through a TF to a :ref:`DataTransfer Engine `, which will log it into a file.

First, let's put the code below in a ``.proto`` file. If the extension is not ``.proto``, the file won't be found by the compilation script.

.. ref-code-block:: cpp

    syntax = "proto3";
    package MyPackage;

    /*
     * Message used for testing
     */
    message MyMessage
    {
        uint32 integer = 1;
        string str = 2;
    }

Now let's compile the file by executing the script from the folder where the file is contained:

.. ref-code-block:: cpp

    nrp_compile_protobuf.py

After the script execution ends, the package is ready to be used with NRPCore.
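Every class protoc generates, including the ``MyMessage`` compiled above, exposes the same small API: plain attribute access for fields plus ``SerializeToString``/``ParseFromString`` for the wire format. Since ``mypackage_pb2`` only exists after running the script, the sketch below illustrates the pattern with ``UInt32Value`` from the standard protobuf distribution as a stand-in:

```python
from google.protobuf import wrappers_pb2

# Stand-in for a freshly compiled mypackage_pb2.MyMessage instance
d = wrappers_pb2.UInt32Value()
d.value = 42                  # fields are plain attributes
wire = d.SerializeToString()  # wire-format bytes, i.e. what travels over gRPC

decoded = wrappers_pb2.UInt32Value()
decoded.ParseFromString(wire)
print(decoded.value)  # prints 42
```

The serialization calls are normally handled by the gRPC Engine client and server; in experiment code you only set and read the message fields.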
Let's test it by implementing an experiment in which a TF reads datapacks of type ``MyMessage`` from a :ref:`Python GRPC Engine ` and sends them to a :ref:`DataTransfer Engine `, which logs them into a file.

We'll first add the experiment configuration file. Paste the code below into a file, e.g. ``simulation_config.json``:

.. ref-code-block:: cpp

    {
        "SimulationTimeout": 5,
        "EngineConfigs":
        [
            {
                "EngineType": "python_grpc",
                "EngineName": "python_1",
                "ServerAddress": "localhost:1234",
                "PythonFileName": "engine_grpc.py",
                "ProtobufPackages": ["MyPackage"]
            },
            {
                "EngineType": "datatransfer_grpc_engine",
                "EngineName": "datatransfer_engine",
                "ServerAddress": "localhost:9006",
                "dataDirectory": "data/test",
                "ProtobufPackages": ["MyPackage"],
                "dumps":
                [
                    {"name": "datapack_1", "network": false, "file": true}
                ]
            }
        ],
        "DataPackProcessingFunctions":
        [
            {
                "Name": "tf",
                "FileName": "tf.py"
            }
        ]
    }

The line:

.. ref-code-block:: cpp

    "ProtobufPackages": ["MyPackage"]

allows the Engines to exchange Protobuf messages from ``MyPackage``. The line:

.. ref-code-block:: cpp

    {"name": "datapack_1", "network": false, "file": true}

declares a :ref:`DataPack ` to be logged, which will be sent by a TF and will be of type ``MyMessage``.

Next, let's add ``engine_grpc.py``, containing the Python Grpc Engine GrpcEngineScript:

.. ref-code-block:: cpp

    from nrp_core.engines.python_grpc import GrpcEngineScript
    import mypackage_pb2

    class Script(GrpcEngineScript):
        def :ref:`initialize `(self):
            self._registerDataPack("datapack_1", mypackage_pb2.MyMessage)
            d = mypackage_pb2.MyMessage()
            d.integer = self._time_ns
            d.str = "from grpc python engine"
            self._setDataPack("datapack_1", d)

        def runLoop(self, timestep_ns):
            self._getDataPack("datapack_1").integer = int(self._time_ns / 1e6)

        def :ref:`shutdown `(self):
            pass

        def :ref:`reset `(self):
            pass

It registers a datapack "datapack_1" of type *mypackage_pb2.MyMessage* and updates its integer field every time step with the current Engine simulation time.
This datapack is fetched every time step by the TF defined below and relayed to the DataTransfer Engine to be logged.

Now let's write the TF. Paste this code into a ``tf.py`` file:

.. ref-code-block:: cpp

    from nrp_core import *
    from mypackage import MyPackageMyMessageDataPack

    @:ref:`EngineDataPack `(keyword='datapack_1', id=:ref:`DataPackIdentifier `('datapack_1', 'python_1'))
    @:ref:`TransceiverFunction `("datatransfer_engine")
    def data_transfer(datapack_1):
        print(f"Relaying datapack with data: '{datapack_1.data.str}' '{datapack_1.data.integer}' to datatransfer_engine")
        datapack_1.engine_name = "datatransfer_engine"
        return [datapack_1]

Finally, let's execute the experiment:

.. ref-code-block:: cpp

    NRPCoreSim -c simulation_config.json

After the experiment completes its execution, there should be a new file ``data/test//datapack_1-0.data`` with many rows containing the data logged from the ``datapack_1`` datapack.