Installation Instructions

The instructions for installing NRP-core from source (currently the only available option) are listed below.

Requirements

WARNING: Previous versions of the NRP install forked versions of several libraries, particularly NEST and Gazebo. Installing NRP-core on a system where a previous version of the NRP is installed is known to cause conflicts. We strongly recommend against it.

Operating System

NRP-core has only been tested on Ubuntu 20.04 at the moment, and therefore this OS and version are recommended. Installation in other environments might be possible but has not been tested yet.

NEST

NRP-core only supports NEST 3.

As part of the installation process, NEST 3 is built and installed. If you have an existing installation of NEST and don't want NRP-core to interfere with your environment, add the parameter -DBUILD_NEST_ENGINE_SERVER=OFF to your NRP-core cmake command. The NEST engine won't be built in this case, but it can still be used from a docker container in experiments.

In any case, be aware that NEST 2.x is incompatible with NRP-core.
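As an illustration, the choice above can be sketched with a small helper (hypothetical, not part of NRP-core; variable names follow the "Setting the environment" section) that composes the cmake options and adds -DBUILD_NEST_ENGINE_SERVER=OFF when an existing NEST installation should be left untouched:

```shell
# Sketch (hypothetical helper): compose NRP-core cmake options, adding
# -DBUILD_NEST_ENGINE_SERVER=OFF when KEEP_EXISTING_NEST=1 is set.
nrp_cmake_args() {
    args="-DCMAKE_INSTALL_PREFIX=${NRP_INSTALL_DIR} -DNRP_DEP_CMAKE_INSTALL_PREFIX=${NRP_DEPS_INSTALL_DIR}"
    if [ "${KEEP_EXISTING_NEST}" = "1" ]; then
        args="${args} -DBUILD_NEST_ENGINE_SERVER=OFF"
    fi
    echo "${args}"
}

NRP_INSTALL_DIR="${HOME}/.local/nrp"
NRP_DEPS_INSTALL_DIR="${HOME}/.local/nrp_deps"
KEEP_EXISTING_NEST=1
nrp_cmake_args    # prints the option string, ending in -DBUILD_NEST_ENGINE_SERVER=OFF
```

The resulting string is what would be passed to cmake in the Installation section; with the server flag off, the NEST engine can still be used from a docker container.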

Dependency Installation

# Start of dependencies installation
# 1 - Pistache REST Server
sudo add-apt-repository ppa:pistache+team/unstable

# 2- Gazebo Install
sudo sh -c 'echo "deb http://packages.osrfoundation.org/gazebo/ubuntu-stable `lsb_release -cs` main" > /etc/apt/sources.list.d/gazebo-stable.list'
wget https://packages.osrfoundation.org/gazebo.key -O - | sudo apt-key add -

sudo apt update
sudo apt install git cmake libpistache-dev libboost-python-dev libboost-filesystem-dev libboost-numpy-dev libcurl4-openssl-dev nlohmann-json3-dev libzip-dev cython3 python3-numpy libgrpc++-dev protobuf-compiler-grpc libprotobuf-dev doxygen libgsl-dev libopencv-dev python3-opencv python3-pil python3-pip libgmock-dev libclang-dev libomp-dev

# required by gazebo engine
sudo apt install libgazebo11-dev gazebo11 gazebo11-plugin-base

# 3- Install required python packages
# Remove flask if it was installed via apt, to ensure it is installed from pip
sudo apt remove python3-flask python3-flask-cors
# required by Python engine
# If you are planning to use The Virtual Brain framework, you will most likely have to use flask version 1.1.4.
# When installing flask 1.1.4, the markupsafe library (installed with flask) has to be downgraded to version 2.0.1 to work properly with gunicorn
# You can install that version with
# pip install flask==1.1.4 gunicorn markupsafe==2.0.1
pip install flask gunicorn paho-mqtt

# required by nest-server (which is built and installed along with nrp-core)
sudo apt install python3-restrictedpython uwsgi-core uwsgi-plugin-python3
pip install flask_cors mpi4py docopt

# required by nrp-server, which uses gRPC python bindings
pip install "grpcio-tools>=1.49.1" pytest psutil docker

# Required for using docker with ssh
pip install paramiko

# The Python packages 'python_on_whales' and 'pyyaml' are optionally required to invoke nrp-core remotely with Docker Compose (see guides/remote_docker_compose.dox for details).
pip install python-on-whales pyyaml


# 4- Installing ROS

# Install ROS: follow the installation instructions at http://wiki.ros.org/noetic/Installation/Ubuntu. To enable ROS support in NRP, only `ros-noetic-ros-base` is required.

# 5- Setting CATKIN workspace
# If there is an existing catkin workspace in your environment and you would like nrp-core to use it, export the variable CATKIN_WS pointing to it:
# E.g. export CATKIN_WS=<path to your catkin workspace>
# Otherwise nrp-core will create and compile a new catkin workspace at: ${HOME}/catkin_ws
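# The workspace selection above can be sketched as a small helper (hypothetical; "$HOME/my_ws" is a placeholder path, substitute your own workspace):

```shell
# Sketch (hypothetical helper): export CATKIN_WS if the given path exists,
# so nrp-core reuses it instead of creating ${HOME}/catkin_ws itself.
use_catkin_ws() {
    if [ -d "$1" ]; then
        export CATKIN_WS="$1"
        echo "using existing workspace: $CATKIN_WS"
    else
        echo "no workspace at $1; nrp-core will create \${HOME}/catkin_ws"
    fi
}

use_catkin_ws "$HOME/my_ws"    # "my_ws" is a placeholder path
```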

# 6- Install SpiNNaker
# Follow the instructions at: https://spinnakermanchester.github.io/development/gitinstall.html.
# Ensure that if using a virtualenv, this is active when running any SpiNNaker scripts.

# 8- Installing Paho MQTT C and CPP
# The MQTT Paho library is required by the datatransfer engine for streaming data over the network
# More information on the project web site: https://github.com/eclipse/paho.mqtt.cpp
# If you do not need the network data streaming feature, you can skip this step.

# MQTT Paho C library
git clone https://github.com/eclipse/paho.mqtt.c.git
cd paho.mqtt.c
git checkout v1.3.8
cmake -Bbuild -H. -DPAHO_ENABLE_TESTING=OFF -DPAHO_BUILD_STATIC=OFF -DPAHO_BUILD_SHARED=ON -DPAHO_WITH_SSL=ON -DPAHO_HIGH_PERFORMANCE=ON -DCMAKE_INSTALL_PREFIX="${NRP_DEPS_INSTALL_DIR}"
cmake --build build/ --target install
sudo ldconfig && cd ..

# MQTT Paho CPP
git clone https://github.com/eclipse/paho.mqtt.cpp.git
cd paho.mqtt.cpp
git checkout v1.2.0
cmake -Bbuild -H. -DPAHO_BUILD_STATIC=OFF -DPAHO_BUILD_SHARED=ON -DCMAKE_INSTALL_PREFIX="${NRP_DEPS_INSTALL_DIR}" -DCMAKE_PREFIX_PATH="${NRP_DEPS_INSTALL_DIR}"
cmake --build build/ --target install
sudo ldconfig && cd ..

# CUDA Support
# The EDLUT simulator supports running on CUDA GPUs. This option can be enabled by setting the EDLUT_WITH_CUDA cmake option to ON when configuring nrp-core.
# It is highly recommended to install CUDA version >= 11.0 due to compatibility with GCC 9 (the default compiler on Ubuntu 20.04)
# To ensure this, you can follow the steps below, which install CUDA 12.0:
sudo apt-get --purge -y remove 'cuda*'
sudo apt-get --purge -y remove 'nvidia*'
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-keyring_1.0-1_all.deb
sudo dpkg -i cuda-keyring_1.0-1_all.deb
sudo apt update
sudo apt install cuda
echo 'export PATH=/usr/local/cuda/bin${PATH:+:${PATH}}' >> ~/.bashrc

# End of dependencies installation

Installation

# Start of installation
git clone https://bitbucket.org/hbpneurorobotics/nrp-core.git
cd nrp-core
mkdir build
cd build

export LD_LIBRARY_PATH=${NRP_DEPS_INSTALL_DIR}/lib:${LD_LIBRARY_PATH}
# if you have ROS installed (Step 4 in dependencies installation), you need to source its setup.bash file before cmake. If you don't need ROS (and did not install it) skip the next line.
. /opt/ros/noetic/setup.bash
# make sure that NRP_INSTALL_DIR is set properly, as mentioned at the beginning of this tutorial
# See the section "Common NRP-core CMake options" in the documentation for the additional ways to configure the project with CMake
cmake .. -DCMAKE_INSTALL_PREFIX="${NRP_INSTALL_DIR}" -DNRP_DEP_CMAKE_INSTALL_PREFIX="${NRP_DEPS_INSTALL_DIR}"
mkdir -p "${NRP_INSTALL_DIR}"
# the installation process might take some time, as it downloads and compiles NEST as well.
# If you haven't installed MQTT libraries (Step 8 in dependencies installation), add ENABLE_MQTT=OFF definition to cmake (-DENABLE_MQTT=OFF).
make
make install
# optionally, build the documentation. It can then be found in a new doxygen folder
make nrp_doxygen

# End of installation

Common NRP-core CMake options

Below is the list of CMake options that can be used to modify the project configuration (turning support for some components and features on or off).

Developers options:

  • COVERAGE enables generation of code coverage reports during testing;

  • BUILD_RST enables generation of the reStructuredText source files from the Doxygen documentation.

Communication protocols options:

  • ENABLE_ROS enables compilation with the ROS support;

  • ENABLE_MQTT enables compilation with the MQTT support.

ENABLE_SIMULATOR and BUILD_SIMULATOR_ENGINE_SERVER options:

  • ENABLE_NEST and BUILD_NEST_ENGINE_SERVER;

  • ENABLE_GAZEBO and BUILD_GAZEBO_ENGINE_SERVER.

The ENABLE_SIMULATOR and BUILD_SIMULATOR_ENGINE_SERVER flags allow disabling the compilation of those parts of nrp-core that depend on or install a specific simulator (e.g. Gazebo, NEST).

The expected behavior for each of these pairs of flags is as follows:

  • the NRPCoreSim is always built, regardless of the values of these flags.

  • if ENABLE_SIMULATOR is set to OFF:

    • the related simulator won't be assumed to be installed in the system, i.e. make won't fail if it isn't. It also won't be installed during the compilation process, even when this possibility is available (as in the case of NEST)

    • the engines connected with this simulator won't be built (neither client nor server components)

    • tests that would fail if the related simulator is not available won’t be built

  • if ENABLE_SIMULATOR is set to ON and BUILD_SIMULATOR_ENGINE_SERVER is set to OFF: same as above, but:

    • the engine clients connected to this simulator will be built. This means that they should not depend on or link to any specific simulator

    • the engine server side components might or might not be built, depending on whether the related simulator is required at compilation time

  • if both flags are set to ON, the simulator is assumed to be installed, or it will be installed from source when this option is available. All targets connected with this simulator will be built

This flag system allows configuring the resulting nrp-core build depending on which simulators are available on the system, both to avoid potential dependency conflicts between simulators and to enforce modularity, opening the possibility of running specific engine servers on a different machine or inside containers.
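The behavior of these flag pairs can be summarized with a small sketch (a hypothetical helper, not part of nrp-core; the "clients-only" case simplifies slightly, since some server-side components may still be built when the simulator is not required at compile time):

```shell
# Sketch (hypothetical helper): summarize which engine components get built
# for a given ENABLE_<SIM> / BUILD_<SIM>_ENGINE_SERVER flag pair.
engine_targets() {
    enable="$1"; build_server="$2"
    if [ "$enable" = "OFF" ]; then
        echo "none"                 # engines for this simulator are skipped entirely
    elif [ "$build_server" = "OFF" ]; then
        echo "clients-only"         # clients built; no link against the simulator
    else
        echo "clients-and-server"   # all targets for this simulator
    fi
}

engine_targets OFF ON    # -> none (the server flag has no effect when the simulator is disabled)
engine_targets ON OFF    # -> clients-only
engine_targets ON ON     # -> clients-and-server
```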

Setting the environment

In order to properly set up the environment to run experiments with NRP-core, please make sure to add the lines below to your ~/.bashrc file:

export NRP_INSTALL_DIR="/home/${USER}/.local/nrp"
export NRP_DEPS_INSTALL_DIR="/home/${USER}/.local/nrp_deps"
source ${NRP_INSTALL_DIR}/bin/.nrp_env
. /usr/share/gazebo-11/setup.sh
. /opt/ros/noetic/setup.bash
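After opening a new terminal (or sourcing ~/.bashrc), the variables can be verified with a small sketch like the following (a hypothetical helper, for checking only):

```shell
# Sketch (hypothetical helper): verify the NRP-core environment variables,
# printing a warning for any that are missing.
check_nrp_env() {
    for var in NRP_INSTALL_DIR NRP_DEPS_INSTALL_DIR; do
        eval "val=\${$var}"
        if [ -z "$val" ]; then
            echo "WARNING: $var is not set"
        else
            echo "$var=$val"
        fi
    done
}

check_nrp_env
```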

Steps for installing additional simulators

This section includes installation steps for simulators that may be used with the Python JSON Engine and the PySim Engine. The PySim Engine allows connecting NRP-core to a set of simulators with Python interfaces, including OpenAI Gym, Mujoco, and OpenSim.

Installation of The Virtual Brain

The instructions below install TVB root and data directly from their git repositories. It is also possible to install them via pip, but then certain features and data sets may not be accessible. Complete instructions can be found on the tvb-root and tvb-data repository pages.

# Install a tool that aliases python3 as python. Needed for TVB installation
sudo apt install python-is-python3

# TVB data
mkdir $HOME/tvb
cd $HOME/tvb
git clone https://github.com/the-virtual-brain/tvb-data.git
cd tvb-data
sudo python3 setup.py develop

# TVB root
cd $HOME/tvb
git clone https://github.com/the-virtual-brain/tvb-root.git
cd tvb-root/tvb_build
./install_full_tvb.sh

# You may need to adjust your numpy version for TVB to work:
pip install numpy==1.21

OpenAI installation

For OpenAI installation (complete instructions at https://gym.openai.com/docs):

pip install gym pygame

Bullet installation

For Bullet installation (complete instructions at https://pybullet.org/wordpress/):

pip install pybullet

Mujoco installation

For Mujoco installation (complete instructions at https://mujoco.org):

MUJOCO_PATH=$HOME/.mujoco
WORKING_DIR=~/Documents/Tmujoco
sudo apt install -y libosmesa6-dev patchelf

mkdir -p $WORKING_DIR
cd $WORKING_DIR
wget https://mujoco.org/download/mujoco210-linux-x86_64.tar.gz -O mujoco.tar.gz
mkdir -p $MUJOCO_PATH
tar -xf mujoco.tar.gz -C $MUJOCO_PATH
rm mujoco.tar.gz

echo 'export LD_LIBRARY_PATH='$MUJOCO_PATH'/mujoco210/bin:$LD_LIBRARY_PATH' >> $HOME/.bashrc
echo 'export MUJOCO_PY_MUJOCO_PATH='$MUJOCO_PATH'/mujoco210/' >> $HOME/.bashrc
echo 'export LD_LIBRARY_PATH=/usr/lib/nvidia:$LD_LIBRARY_PATH' >> $HOME/.bashrc

source $HOME/.bashrc

cd $HOME
rm -r $WORKING_DIR
pip3 install mujoco_py
python3 -c "import mujoco_py"

OpenSim installation

For OpenSim installation (complete instructions at https://github.com/opensim-org/opensim-core):

# Install opensim dependencies that are available through apt

sudo apt install cmake doxygen git pip openjdk-8-jdk python3-dev wget build-essential libtool autoconf pkg-config gfortran libopenblas-dev freeglut3-dev libxi-dev libxmu-dev

# Install python dependencies.
# Version 1.21 of numpy is used to stay compatible with TVB

pip install numpy==1.21

# Create opensim directories

OPENSIM_ROOT=${HOME}/opensim

OPENSIM_INSTALL_PATH=${HOME}/opensim/install
OPENSIM_BUILD_PATH=${HOME}/opensim/build

OPENSIM_DEPS_INSTALL_PATH=${HOME}/opensim/dependencies_install
OPENSIM_DEPS_BUILD_PATH=${HOME}/opensim/dependencies_build

mkdir $OPENSIM_ROOT

mkdir $OPENSIM_BUILD_PATH
mkdir $OPENSIM_INSTALL_PATH

mkdir $OPENSIM_DEPS_BUILD_PATH
mkdir $OPENSIM_DEPS_INSTALL_PATH

# Compile and install (globally) the latest version of swig
# The version available through apt (4.0.1) is incompatible with the latest opensim

sudo apt install -y libpcre2-dev bison byacc
cd ${OPENSIM_ROOT}
git clone https://github.com/swig/swig
cd ${OPENSIM_ROOT}/swig
./autogen.sh
./configure
make -j4
sudo make install

# Clone opensim

# NOTE:
# Both opensim and its dependencies should be built in Release mode (CMAKE_BUILD_TYPE=Release)!
# Building with debug symbols makes the size of the resulting binaries unacceptably large

cd ${OPENSIM_ROOT}
git clone https://github.com/opensim-org/opensim-core.git

# Build some of the dependencies (simbody, spdlog...) as part of OpenSim 'superbuild'
# OPENSIM_WITH_CASADI=ON and OPENSIM_WITH_TROPTER=ON switches will trigger
# builds of certain necessary dependencies, like casadi, adolc, colpack, etc.

cd ${OPENSIM_DEPS_BUILD_PATH}
cmake ../opensim-core/dependencies/ \
      -DCMAKE_INSTALL_PREFIX=${OPENSIM_DEPS_INSTALL_PATH} \
      -DCMAKE_BUILD_TYPE=Release \
      -DOPENSIM_WITH_CASADI=ON \
      -DOPENSIM_WITH_TROPTER=ON
make -j8
make -j8 install

# Fixes "opensim/modeling/SmoothSphereHalfSpaceForce.java:49: error: unmappable character for encoding ASCII"

export JAVA_TOOL_OPTIONS=-Dfile.encoding=UTF8

# Build opensim

cd ${OPENSIM_BUILD_PATH}
cmake ../opensim-core \
      -DCMAKE_INSTALL_PREFIX="${OPENSIM_INSTALL_PATH}" \
      -DCMAKE_BUILD_TYPE=Release \
      -DOPENSIM_DEPENDENCIES_DIR="${OPENSIM_DEPS_INSTALL_PATH}" \
      -DBUILD_PYTHON_WRAPPING=ON \
      -DBUILD_JAVA_WRAPPING=ON \
      -DWITH_BTK=ON

make -j8
make -j8 install

cd $HOME

# Export opensim python wrappers and packages

echo 'export PYTHONPATH=$HOME/opensim/install/lib/python3.8/site-packages/:$PYTHONPATH' >> .bashrc

# Export opensim libraries
# Some of the dependencies (ipopt, adolc) aren't installed with 'make install', so we have to export them too

echo 'export LD_LIBRARY_PATH=$HOME/opensim/dependencies_install/ipopt/lib/:$LD_LIBRARY_PATH' >> .bashrc
echo 'export LD_LIBRARY_PATH=$HOME/opensim/dependencies_install/adol-c/lib64/:$LD_LIBRARY_PATH' >> .bashrc
echo 'export LD_LIBRARY_PATH=$HOME/opensim/install/lib/:$LD_LIBRARY_PATH' >> .bashrc

source $HOME/.bashrc